Balancing Selection and Diversity in Ensemble Learning with Exponential Mixture Model

Published: 01 Jan 2023 · Last Modified: 13 May 2025 · ICANN (3) 2023 · CC BY-SA 4.0
Abstract: In practical machine learning scenarios, multiple predictors may be available for the same task. Ensemble learning combines these predictors to obtain a predictor with higher generalization performance. Weighted averaging is one of the most basic ensemble methods, and it can be generalized through a formulation based on an exponential mixture model. In this formulation, weight optimization involves two competing factors: selecting predictors by concentrating the weights, and maintaining the diversity of the predictors by distributing the weights. It has been theoretically shown that generalization performance improves when these two factors are balanced equally. However, an equal balance is not necessarily optimal. In this paper, we propose a method to obtain a predictor with higher generalization performance by adjusting the balance between predictor selection and diversity. Numerical experiments showed that when the training data are plentiful and the label distribution is unbiased, adjusting this balance can improve generalization performance.
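As a concrete illustration of the formulation described in the abstract, the following is a minimal sketch of an exponential (geometric) mixture of predictive distributions together with an entropy-regularized weight objective. All names here (`exponential_mixture`, `weight_objective`, the balance parameter `lam`) are hypothetical; the abstract does not give the paper's exact objective, so the decomposition into a data-fit (selection) term and an entropy (diversity) term is an assumption, not the authors' actual method.

```python
import numpy as np

def exponential_mixture(probs, w, eps=1e-12):
    """Exponential (geometric) mixture of predictive distributions.

    probs: (n_models, n_classes) array; row i is predictor i's p_i(y | x).
    w:     (n_models,) nonnegative weights summing to 1.
    Returns p_w(y | x) proportional to prod_i p_i(y | x) ** w[i].
    """
    log_mix = w @ np.log(probs + eps)   # weighted sum of log-probabilities
    log_mix -= log_mix.max()            # shift by the max for numerical stability
    p = np.exp(log_mix)
    return p / p.sum()                  # renormalize to a distribution

def weight_objective(w, loss_per_model, lam, eps=1e-12):
    """Hypothetical balance objective (an assumption, not the paper's formula).

    The data-fit term favors concentrating weight on low-loss predictors
    (selection); the entropy term favors spreading weight across predictors
    (diversity). lam tunes the balance; lam = 1 would correspond to the
    'equal balance' case mentioned in the abstract.
    """
    entropy = -np.sum(w * np.log(w + eps))
    return w @ loss_per_model - lam * entropy
```

Under this reading, adjusting the balance amounts to treating `lam` as a tunable hyperparameter rather than fixing it to 1, for example by selecting it on held-out data.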