Research Article

An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade

by P. R. Deshmukh, Roshani Ade
Communications on Applied Electronics
Foundation of Computer Science (FCS), NY, USA
Volume 4 - Number 3
Year of Publication: 2016
DOI: 10.5120/cae2016652040

P. R. Deshmukh and Roshani Ade. An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade. Communications on Applied Electronics 4, 3 (January 2016), 10-14. DOI=10.5120/cae2016652040

BibTeX
@article{10.5120/cae2016652040,
  author     = {P. R. Deshmukh and Roshani Ade},
  title      = {An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade},
  journal    = {Communications on Applied Electronics},
  issue_date = {January 2016},
  volume     = {4},
  number     = {3},
  month      = {January},
  year       = {2016},
  issn       = {2394-4714},
  pages      = {10-14},
  numpages   = {5},
  url        = {https://www.caeaccess.org/archives/volume4/number3/507-2016652040/},
  doi        = {10.5120/cae2016652040},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
EndNote
%0 Journal Article
%A P. R. Deshmukh
%A Roshani Ade
%T An Incremental Ensemble of Classifiers based on Hypothesis Strength and Ambiguity Grade
%J Communications on Applied Electronics
%@ 2394-4714
%V 4
%N 3
%P 10-14
%D 2016
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The massive amount of raw student data in educational organizations can be converted into information, and the knowledge buried in it can be extracted for various student-related applications. Because student data in educational systems grows day by day, an incremental learning algorithm, unlike a classical batch learning algorithm, tries to forget unrelated information while training on fresh examples. Combining classifiers, that is, consulting more than one opinion, contributes a great deal to obtaining more accurate results. This paper therefore proposes an incremental ensemble of two classifiers, Naïve Bayes and K-Star, combined by a voting scheme based on hypothesis strength and ambiguity grade. The proposed voting rule is compared with the existing majority voting rule on a student data set.
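To make the voting idea concrete, the following is a minimal Python sketch of a confidence-weighted incremental ensemble, assuming scikit-learn. It is an illustration under stated assumptions, not the authors' implementation: scikit-learn has no K-Star, so KNeighborsClassifier stands in for the instance-based member; hypothesis strength is proxied by a member's top posterior probability and ambiguity grade by the margin between its two highest posteriors; the class name IncrementalEnsemble and the weighting rule are hypothetical.

import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

class IncrementalEnsemble:
    """Two-member incremental ensemble with confidence-weighted voting.

    Assumes every class label appears in the first training batch.
    """

    def __init__(self, classes, n_neighbors=5):
        self.classes = np.unique(classes)   # fixed, sorted label set
        self.nb = GaussianNB()              # genuinely incremental member
        self.n_neighbors = n_neighbors
        self._X, self._y = [], []           # buffer for the instance-based member

    def partial_fit(self, X, y):
        # Naive Bayes updates its sufficient statistics without revisiting old data.
        self.nb.partial_fit(X, y, classes=self.classes)
        # The K-Star stand-in accumulates instances and is refit on the buffer.
        self._X.append(np.asarray(X))
        self._y.append(np.asarray(y))
        X_all, y_all = np.vstack(self._X), np.concatenate(self._y)
        self.knn = KNeighborsClassifier(n_neighbors=min(self.n_neighbors, len(y_all)))
        self.knn.fit(X_all, y_all)
        return self

    def predict(self, X):
        votes = np.zeros((len(X), len(self.classes)))
        for clf in (self.nb, self.knn):
            proba = clf.predict_proba(X)
            top2 = np.sort(proba, axis=1)[:, -2:]
            strength = top2[:, 1]              # proxy for hypothesis strength
            margin = top2[:, 1] - top2[:, 0]   # small margin = high ambiguity
            weight = strength * margin         # confident, unambiguous votes count more
            cols = np.searchsorted(self.classes, clf.classes_)
            votes[:, cols] += weight[:, None] * proba
        return self.classes[votes.argmax(axis=1)]

Replacing the weight above with a constant and each proba row with a one-hot vote recovers plain majority voting, the baseline the abstract compares against.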

Index Terms

Computer Science
Information Sciences

Keywords

Incremental learning, ensemble, voting rule