
Experimental Study and Review of Boosting Algorithms

Harshita L. Patel, Amit P. Ganatra, C.K. Bhensdadia, Y.P. Kosta

Abstract


Ensembles of classifiers are currently an active research topic. An ensemble is obtained by generating several base classifiers, each constructed with some machine learning method, and combining their outputs. Combining the output of different models is an obvious way to make decisions more reliable, and several machine learning techniques do exactly that by learning an ensemble of models and using them in combination: prominent among these are the schemes called bagging, boosting, and stacking. All three can, more often than not, improve predictive performance over a single model, and they are general techniques that apply to both classification tasks and numeric prediction problems. Although bagging, boosting, and stacking have been developed only over the past decade, their performance is often astonishingly good. In this paper we present a comparative study of boosting algorithms and compare their performance with bagging and stacking. The machine learning algorithms implemented in WEKA (Waikato Environment for Knowledge Analysis) are used for the study, and the results obtained by the different algorithms over several datasets are compared.
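To make the boosting scheme discussed above concrete, the following is a minimal sketch of AdaBoost with decision stumps in plain Python. It is an illustration only, not the paper's WEKA implementation; the function names and the toy dataset are hypothetical.

```python
import math

# Decision stump: the classic weak learner paired with boosting.
def stump_predict(x, feature, threshold, polarity):
    return polarity if x[feature] >= threshold else -polarity

def best_stump(X, y, w):
    # Exhaustive search for the stump minimizing weighted training error.
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, f, t, pol) != yi)
                if err < best_err:
                    best, best_err = (f, t, pol), err
    return best, best_err

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n                 # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        (f, t, pol), err = best_stump(X, y, w)
        err = max(err, 1e-12)         # guard against log(0)
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, pol))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]      # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    # Final hypothesis: sign of the weighted vote of all stumps.
    score = sum(a * stump_predict(x, f, t, pol) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

On a one-dimensional "interval" concept such as X = [[0], [1], [2], [3], [4], [5]] with labels [-1, -1, +1, +1, -1, -1], no single stump can be consistent with the data, yet the boosted combination of a few stumps classifies every training point correctly, which is precisely the point of combining weak learners.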

Keywords


Bagging, Bayesian Network, Boosting, Classifiers, Ensemble Learning, Feature Selection, Machine Learning, MLP (Multilayer Perceptron), Naïve Bayes Classifier, Predictive Accuracy, SMO (Sequential Minimal Optimization Algorithm for Training a Support Vector Machine)


References


Jiawei Han and Micheline Kamber, Data Mining: Concepts and Techniques.

Ian H. Witten and Eibe Frank, Data Mining: Practical Machine Learning Tools and Techniques.

www.boosting.org

Pei-Yong Xia, Xiang-Qian Ding, and Bai-Ning Jiang, “A GA-Based Feature Selection and Ensemble Learning for High Dimensional Datasets”

Laura Emmanuella A. Santana, Diogo F. de Oliveira, Anne M. P. Canuto, and Marcilio C. P. de Souto, “A Comparative Analysis of Feature Selection Methods for Ensembles with Different Combination Methods”

Mrutyunjaya Panda and Manas Ranjan Patra, “Bayesian Belief Networks with Genetic Local Search for Detecting Network Intrusions”

Nikunj C. Oza, “Ensemble Data Mining Methods”

Robi Polikar, “Ensemble Based Systems in Decision Making”

Richard Maclin and David Opitz, “An Empirical Evaluation of Bagging and Boosting”

Nikunj Chandrakant Oza, “Online Ensemble Learning”

Yoav Freund and Robert E. Schapire, “Experiments with a New Boosting Algorithm”

Yoav Freund and Robert E. Schapire, “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting”

Jerome Friedman, Trevor Hastie, and Robert Tibshirani, “Additive Logistic Regression: A Statistical View of Boosting”

David Opitz and Richard Maclin, “Popular Ensemble Methods: An Empirical Study”

Robert E. Schapire, “The Boosting Approach to Machine Learning: An Overview”

Eric Bauer and Ron Kohavi, “An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants”

Geoffrey I. Webb and Robert Schapire, “MultiBoosting: A Technique for Combining Boosting and Wagging”

Xiao Yi Yu and Aiming Wang, “Steganalysis Based on Bayesian Network and Genetic Algorithm”

Kok-Chin Khor, Choo-Yee Ting, and Somnuk Phon-Amnuaisuk, “From Feature Selection to Building of Bayesian Classifiers: A Network Intrusion Detection Perspective”

Takuya Kidera, Seiichi Ozawa, and Shigeo Abe, “An Incremental Learning Algorithm of Ensemble Classifier Systems”

Mian Zhou and Hong Wei, “Constructing Weak Learner and Performance Evaluation in AdaBoost”

H. Chouaib, O. Ramos Terrades, S. Tabbone, F. Cloppet, and N. Vincent, “Feature Selection Combining Genetic Algorithm and AdaBoost Classifiers”

Nima Hatami and Reza Ebrahimpour, “Combining Multiple Classifiers: Diversify with Boosting and Combining by Stacking”

Ashutosh Garg, Vladimir Pavlovic, and Thomas S. Huang, “Bayesian Networks as Ensemble of Classifiers”

Mariapia Lampis and John Andrews, “Introducing Dynamics in a Fault Diagnostic Application Using Bayesian Belief Networks”

Faten Hussein, Nawwaf Kharma, and Rabab Ward, “Genetic Algorithms for Feature Selection and Weighting, a Review and Study”

Michael C. Lee, Lilla Boroczky, Kivilcim Sungur-Stasik, Aaron D. Cann, Alain C. Borczuk, Steven M. Kawut, and Charles A. Powell, “A Two-Step Approach for Feature Selection and Classifier Ensemble Construction in Computer-Aided Diagnosis”

Laura E. A. Santana, Ligia Silva, and Anne M. P. Canuto, “Feature Selection in Heterogeneous Structure of Ensembles: A Genetic Algorithm Approach”

Kashif Waqas, Rauf Baig, and Shahid Ali, “Feature Subset Selection Using Multi-Objective Genetic Algorithms”

