An Improved AdaBoost Algorithm for Unbalanced Classification Data

Published: 01 Jan 2009 · Last Modified: 06 Feb 2025 · FSKD (1) 2009 · CC BY-SA 4.0
Abstract: The AdaBoost algorithm has proven to be a very efficient classification method for balanced datasets, in which all classes have similar proportions. In real applications, however, unbalanced datasets are common: the class of interest often has very few examples. This is problematic because the algorithm may assign all cases to the majority classes without any loss of overall accuracy. This paper proposes an improved AdaBoost algorithm called BABoost (Balanced AdaBoost), which gives higher weights to misclassified examples from the minority class. Empirical results show that the new method decreases the prediction error on the minority class significantly while only slightly increasing the prediction error on the majority class. It also produces larger margins, which indicates a better classifier.
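Below is a minimal sketch of the general idea described in the abstract: an AdaBoost-style boosting loop in which misclassified minority-class examples receive extra weight. The exact BABoost weight-update rule is not given in the abstract, so the `minority_boost` factor, the function names, and the use of decision stumps as weak learners are all assumptions for illustration, not the paper's method.

```python
# Sketch of a class-balanced AdaBoost variant (illustrative only; the
# per-class boosting factor is an assumption, not the paper's BABoost rule).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def balanced_adaboost(X, y, n_rounds=50, minority_label=1, minority_boost=2.0):
    """Train an ensemble of stumps; labels y are assumed to be in {-1, +1}.

    `minority_boost` is a hypothetical knob controlling how much extra weight
    misclassified minority-class examples receive each round.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                  # example weights, start uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        miss = pred != y
        err = float(np.dot(w, miss))
        if err == 0.0 or err >= 0.5:         # perfect or no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        # Standard AdaBoost reweighting ...
        w *= np.exp(alpha * np.where(miss, 1.0, -1.0))
        # ... plus extra emphasis on misclassified minority-class examples
        w[miss & (y == minority_label)] *= minority_boost
        w /= w.sum()                         # renormalise
        learners.append(stump)
        alphas.append(alpha)
    return learners, np.array(alphas)

def predict(learners, alphas, X):
    # Weighted majority vote of the weak learners
    votes = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
    return np.sign(votes)
```

Compared with plain AdaBoost, the only change in this sketch is the extra multiplication of weights for misclassified minority examples, which mirrors the abstract's stated idea of giving those examples higher weight.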