|Communications on Applied Electronics
|Foundation of Computer Science (FCS), NY, USA
|Volume 4 - Number 3
|Year of Publication: 2016
|Authors: P. R. Deshmukh, Roshani Ade
P. R. Deshmukh, Roshani Ade. An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade. Communications on Applied Electronics 4, 3 (January 2016), 10-14. DOI=10.5120/cae2016652040
The massive amount of raw student data held by educational organizations can be converted into information, and the knowledge buried in it can be extracted for various student-related applications. Because student data in educational systems grows day by day, incremental learning algorithms are preferable to classical batch learning: they train on fresh examples while forgetting unrelated information. Moreover, combining classifiers, that is, taking more than one opinion, contributes greatly to obtaining more accurate results. This paper therefore proposes an incremental ensemble of two classifiers, Naïve Bayes and K-Star, combined by a voting scheme based on hypothesis strength and ambiguity grade. The proposed voting rule is compared with the existing majority voting rule on the student data set.
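To make the voting idea concrete, the sketch below combines the posterior estimates of two classifiers with a confidence-weighted vote and contrasts it with plain majority voting. The names `hypothesis_strength` and `ambiguity_grade` follow the paper's terminology, but the exact formulas and function signatures here are illustrative assumptions, not the authors' definitions.

```python
# Illustrative sketch only: the formulas for hypothesis strength and
# ambiguity grade below are assumptions, not the paper's definitions.

def hypothesis_strength(posteriors):
    """Assumed: a classifier's confidence is its highest posterior."""
    return max(posteriors.values())

def ambiguity_grade(posteriors):
    """Assumed: ambiguity = 1 minus the gap between the top two posteriors.
    A classifier torn between two labels is highly ambiguous."""
    ranked = sorted(posteriors.values(), reverse=True)
    gap = ranked[0] - (ranked[1] if len(ranked) > 1 else 0.0)
    return 1.0 - gap

def weighted_vote(classifier_outputs):
    """Each output is a dict {label: posterior}, e.g. from Naive Bayes
    and K-Star. A confident, unambiguous classifier gets a larger say."""
    scores = {}
    for posteriors in classifier_outputs:
        weight = hypothesis_strength(posteriors) * (1.0 - ambiguity_grade(posteriors))
        for label, p in posteriors.items():
            scores[label] = scores.get(label, 0.0) + weight * p
    return max(scores, key=scores.get)

def majority_vote(classifier_outputs):
    """Baseline for comparison: each classifier casts one vote for its
    top label, regardless of how confident it is."""
    votes = {}
    for posteriors in classifier_outputs:
        top = max(posteriors, key=posteriors.get)
        votes[top] = votes.get(top, 0) + 1
    return max(votes, key=votes.get)
```

For example, if a Naïve Bayes model barely prefers "pass" (0.55 vs 0.45) while a K-Star-like model strongly prefers "fail" (0.9 vs 0.1), majority voting ties, whereas the weighted rule lets the confident classifier dominate and outputs "fail".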