SVM Ensembles on a Budget

Published: 01 Jan 2022, Last Modified: 03 Oct 2024, ICANN (4) 2022, CC BY-SA 4.0
Abstract: This paper presents a method to train an ensemble of SVMs that achieves better generalization performance at a lower computational training cost than a single SVM. Instead of training a single SVM on the whole dataset, the proposed model trains a diverse set of simpler SVMs. Specifically, the algorithm creates B subensembles of T SVMs, using a different set of hyper-parameters in each subensemble. Then, to gain further diversity, the T SVMs of each subensemble are trained on different disjoint 1/T fractions of the training set. The paper presents an extensive analysis of the computational training complexity of the algorithm. The experiments show that, for any given computational budget, the presented method obtains better generalization performance than a single SVM.
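
To make the training scheme from the abstract concrete, the following is a minimal sketch using scikit-learn's SVC: each hyper-parameter setting defines one subensemble, and the T SVMs of that subensemble are fit on disjoint 1/T fractions of the data. The function names, the hyper-parameter grid, and the majority-vote aggregation are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the described budgeted SVM ensemble (assumptions: scikit-learn SVC,
# majority-vote aggregation, integer class labels >= 0).
import numpy as np
from sklearn.svm import SVC

def train_budgeted_svm_ensemble(X, y, hyperparams, T, seed=0):
    """Train B subensembles (one per hyper-parameter dict in `hyperparams`),
    each made of T SVMs fit on disjoint 1/T fractions of the training set."""
    rng = np.random.default_rng(seed)
    n = len(X)
    ensemble = []
    for params in hyperparams:            # B subensembles, one hyper-parameter set each
        idx = rng.permutation(n)
        folds = np.array_split(idx, T)    # T disjoint 1/T fractions
        for fold in folds:
            ensemble.append(SVC(**params).fit(X[fold], y[fold]))
    return ensemble

def predict_majority(ensemble, X):
    """Combine the B*T SVMs by majority vote (an assumed combination rule)."""
    votes = np.stack([svm.predict(X) for svm in ensemble])  # shape (B*T, n_samples)
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)

# Hypothetical usage:
# hyperparams = [{"C": 1.0, "gamma": "scale"}, {"C": 10.0, "gamma": 0.1}]
# models = train_budgeted_svm_ensemble(X_train, y_train, hyperparams, T=4)
# y_pred = predict_majority(models, X_test)
```

Because each SVM sees only n/T samples and SVM training scales superlinearly in the number of samples, the B*T small fits are cheaper in total than one fit on the full dataset, which is the budget argument the abstract alludes to.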