Stratified over-sampling bagging method for random forests on imbalanced data

He Zhao, Xiaojun Chen, Tung Nguyen, Joshua Zhexue Huang, Graham Williams, Hui Chen

Published: 01 Jan 2016, Last Modified: 15 Jan 2026. Intelligence and Security Informatics - 11th Pacific Asia Workshop, PAISI 2016, Proceedings. License: CC BY-SA 4.0
Abstract: Imbalanced data presents a significant challenge to random forests (RF). Over-sampling is a commonly used sampling method for imbalanced data, which increases the number of minority-class instances to balance the class distribution. However, if we only sample more minority-class instances, such a method often produces training data sets that are highly correlated, which reduces the generalizability of RF. To solve this problem, we propose a stratified over-sampling (SOB) method to generate training data sets for RF that are both balanced and diverse. We first cluster the training data set multiple times to produce multiple clustering results. The small individual clusters are then grouped according to their entropies. Next, we sample a set of training data sets from the groups of clusters using a stratified sampling method. Finally, these training data sets are used to train RF. The data sets sampled with SOB are guaranteed to be balanced and diverse, which improves the performance of RF on imbalanced data. We have conducted a series of experiments, and the results show that the proposed method is more effective than several existing sampling methods.
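
The abstract describes the SOB pipeline only at a high level. The Python sketch below (using scikit-learn and SciPy) illustrates how such a pipeline could be wired together; it is a minimal sketch, not the authors' implementation, and the number of clusterings, the median entropy split into two strata, and the per-class sample sizes are assumptions chosen for illustration.

```python
# Illustrative sketch of a stratified over-sampling bagging (SOB) pipeline.
# NOT the authors' implementation: the number of clusterings, the median
# entropy split, and the per-class sample sizes are all assumptions.
import numpy as np
from scipy.stats import entropy
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier


def cluster_entropy(labels):
    """Class-distribution entropy of one cluster (0 = pure, larger = mixed)."""
    _, counts = np.unique(labels, return_counts=True)
    return entropy(counts / counts.sum())


def sob_training_sets(X, y, n_sets=10, n_clusterings=5, k=20, seed=0):
    """Yield balanced, diverse training sets built from clustered strata.

    Assumes y is an integer-encoded class label array."""
    rng = np.random.default_rng(seed)

    # Step 1: cluster the data several times to obtain many small clusters.
    clusters = []
    for i in range(n_clusterings):
        km = KMeans(n_clusters=k, n_init=10, random_state=seed + i).fit(X)
        clusters += [np.where(km.labels_ == c)[0] for c in range(k)
                     if np.any(km.labels_ == c)]

    # Step 2: group clusters into strata by their class entropy.
    ents = [cluster_entropy(y[idx]) for idx in clusters]
    cut = np.median(ents)
    strata = [[c for c, e in zip(clusters, ents) if e <= cut],
              [c for c, e in zip(clusters, ents) if e > cut]]

    # Step 3: stratified sampling -- draw the same number of instances per
    # class from each stratum so every training set is class-balanced.
    classes = np.unique(y)
    per_class = np.bincount(y).max() // len(strata)
    for _ in range(n_sets):
        picked = []
        for stratum in strata:
            if not stratum:
                continue
            pool = np.concatenate(stratum)
            for cls in classes:
                cls_pool = pool[y[pool] == cls]
                if len(cls_pool):
                    picked.append(rng.choice(cls_pool, size=per_class,
                                             replace=True))
        idx = np.concatenate(picked)
        yield X[idx], y[idx]


def train_sob_forests(X, y, n_sets=10):
    """Train one random forest per balanced sample (bagging over samples)."""
    return [RandomForestClassifier(n_estimators=50, random_state=0).fit(Xs, ys)
            for Xs, ys in sob_training_sets(X, y, n_sets=n_sets)]
```

Class predictions from the individual forests would then be combined, e.g. by majority vote, in the usual bagging fashion.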