
Analysis of Classifier Ensembles for Network Intrusion Detection Systems

Neeraj Bisht, Amir Ahmad, A. K. Pant. Published in Security.

Communications on Applied Electronics
Year of Publication: 2017
Publisher: Foundation of Computer Science (FCS), NY, USA
Authors: Neeraj Bisht, Amir Ahmad, A. K. Pant
DOI: 10.5120/cae2017652516

Neeraj Bisht, Amir Ahmad and A. K. Pant. Analysis of Classifier Ensembles for Network Intrusion Detection Systems. Communications on Applied Electronics 6(7):47-53, February 2017. BibTeX

@article{10.5120/cae2017652516,
	author = {Neeraj Bisht and Amir Ahmad and A. K. Pant},
	title = {Analysis of Classifier Ensembles for Network Intrusion Detection Systems},
	journal = {Communications on Applied Electronics},
	issue_date = {February 2017},
	volume = {6},
	number = {7},
	month = {Feb},
	year = {2017},
	issn = {2394-4714},
	pages = {47-53},
	numpages = {7},
	url = {http://www.caeaccess.org/archives/volume6/number7/707-2017652516},
	doi = {10.5120/cae2017652516},
	publisher = {Foundation of Computer Science (FCS), NY, USA},
	address = {New York, USA}
}

Abstract

This paper presents an ensemble approach to classifying network security data. Experiments are carried out with decision tree and Naïve Bayes classifiers, combined with ensemble methods such as Bagging, AdaBoost.M1, Random Forests, MultiBoosting, Rotation Forest and Random Subspace, on the NSL-KDD dataset, a modified version of the KDD anomaly detection dataset [1]. Results across different performance measures suggest that no single classification method is best for all types of datasets on all performance measures. The tabulated experimental results show that the decision tree ensembles performed better than the Naïve Bayes ensembles. The results also suggest that a single decision tree is a good classifier for this data, as it achieves reasonable classification accuracy with low training and testing time.
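The comparison described in the abstract can be sketched with scikit-learn. This is an illustrative sketch only, not the paper's experimental setup: NSL-KDD is not bundled with the library, so a synthetic binary-classification dataset stands in for the intrusion-detection data, and Rotation Forest (which scikit-learn does not ship) is omitted; Random Subspace is approximated via feature subsampling in `BaggingClassifier`.

```python
# Hedged sketch of the ensemble comparison: base classifiers (decision tree,
# Naive Bayes) vs. tree ensembles (Bagging, AdaBoost.M1-style boosting,
# Random Forest, random-subspace bagging) on a synthetic stand-in dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data standing in for NSL-KDD (assumption, see lead-in).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Bagging (trees)": BaggingClassifier(
        DecisionTreeClassifier(random_state=0),
        n_estimators=50, random_state=0),
    "AdaBoost (trees)": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1),
        n_estimators=50, random_state=0),
    "Random subspace (trees)": BaggingClassifier(
        DecisionTreeClassifier(random_state=0),
        n_estimators=50, max_features=0.5,  # each tree sees half the features
        bootstrap=False, random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=50, random_state=0),
}

# Fit each model and report held-out accuracy, one of several measures
# (alongside training/testing time) the paper compares.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

On real NSL-KDD data one would additionally time `fit` and `predict`, since the paper's point about the single decision tree rests on its low training and testing cost as well as its accuracy.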

References

  1. KDD Cup’99 Data. http://www.sigkdd.org. Accessed 15 July 2012
  2. Pfleeger CP (1997) Security in Computing. Prentice Hall, Upper Saddle River, NJ
  3. Lazarevic A, Ertoz L, Kumar V, Ozgur A, Srivastava J (May 2003) A comparative study of anomaly detection schemes in network intrusion detection. In Proceedings of the SIAM International Conference on Data Mining, San Francisco
  4. Amor NB, Benferhat S, Elouedi Z (2004) Naïve Bayes vs. Decision Trees in Intrusion Detection Systems. In Proceedings of ACM Symposium on Applied Computing, Nicosia, Cyprus
  5. Gaddam SR, Phoha VV, Balagani KS (2007) K-Means+ID3: A novel method for supervised anomaly detection by cascading K-means clustering and ID3 decision tree learning methods. IEEE Trans Knowl Data Eng 19(3):345-354
  6. Horng SJ, Su MY, Chen YH, Kao TW, Chen RJ, Lai JL, Perkasa CD (2011) A novel intrusion detection system based on hierarchical clustering and support vector machines. Expert Syst with Appl 38(1):306-313
  7. Sabhnani M, Serpen G (2003) Application of machine learning algorithms to KDD intrusion detection dataset within misuse detection context. In Proceedings of Conference on Machine Learning Models, Technology and Application (pp. 209-215), MLMTA
  8. Tajbakhsh A, Rahmati M, Mirzaei A (2009) Intrusion detection using fuzzy association rules. Appl Soft Comput 9(2):462-469
  9. Hansen LK, Salamon P (1990) Neural network ensembles. IEEE Trans Patt Anal Mach Intel 12: 993-1001
  10. Kuncheva LI (2004) Combining pattern classifiers: Methods and Algorithms. Wiley-IEEE Press, New York
  11. Breiman L (1996) Bagging predictors. Machine Learning 24(2):123–140
  12. Freund Y, Schapire RE (1997) A decision theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci 55:119-139
  13. Webb GI (2000) MultiBoosting: A Technique for Combining Boosting and Wagging. Machine Learning 40(2):159–196
  14. Ho TK (1998) The Random Subspace Method for Constructing Decision Forests. IEEE Trans Patt Anal Mach Intell 20(8):832–844
  15. Breiman L (2001) Random Forests. Machine Learning 45(1):5–32
  16. Tavallaee M, Bagheri E, Lu W, Ghorbani AA (2009) A Detailed Analysis of the KDD CUP 99 Data Set. In Proceedings of the Second IEEE Symposium on Computational Intelligence for Security and Defense Applications (CISDA) (pp. 53-58), Piscataway, NJ, USA
  17. Breiman L, Friedman JH, Olshen R, Stone C (1984) Classification and Regression Trees. Chapman and Hall, London
  18. Halawani SM, Alhaddad M, Ahmad A (2012) A study of digital mammograms by using clustering algorithms. J Sci Ind Res 71:594-600
  19. Quinlan JR (1993) C4.5: Programs for machine learning. Morgan Kaufmann, San Mateo
  20. Bishop CM (2006) Pattern Recognition and Machine Learning. Springer-Verlag, New York
  21. Kuncheva LI, Rodriguez JJ (2007) An Experimental Study on Rotation Forest Ensembles. In Proceedings of 7th International Workshop on Multiple Classifier Systems, MCS'07 (pp. 459-468), Prague, Czech Republic, LNCS 4472
  22. Witten IH, Frank E (2000) Data Mining: Practical Machine Learning Tools with Java Implementations. Morgan Kaufmann, San Francisco

Keywords

network security; NSL KDD; classifier; ensembles; Naïve Bayes; decision trees