Efficient Training of CNN Ensembles via Feature-Prioritized Boosting

Published: 22 Sept 2025, Last Modified: 01 Dec 2025
Venue: NeurIPS 2025 Workshop
License: CC BY 4.0
Keywords: Optimization, Importance Sampling, CNN
TL;DR: Subgrid-boosted CNN ensembles improve training speed and accuracy by focusing on informative features and integrating boosting into the learning process.
Abstract: Convolutional Neural Networks (CNNs) have achieved remarkable success in computer vision, yet training deep architectures with millions of parameters remains computationally expensive and design-intensive. We present a novel framework for efficient optimization of CNN ensembles that integrates subgrid-boosted feature selection with boosting-inspired learning. Our method introduces subgrid selection and importance sampling to emphasize statistically informative regions of the feature space, while embedding boosting weights directly into the ensemble training process through a least squares formulation. This design accelerates convergence, reduces the burden of manual architecture tuning, and enhances predictive performance. Across multiple fine-grained image classification benchmarks, our subgrid-boosted CNN ensembles consistently outperform conventional CNNs in both accuracy and training efficiency, demonstrating the effectiveness and generality of the proposed approach.
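The abstract names two mechanisms: importance sampling over feature subgrids, and boosting weights embedded in a least squares formulation for the ensemble. Since the paper's exact formulation is not given here, the Python sketch below illustrates one plausible reading of each. All names (`sample_subgrids`, `boosted_least_squares_weights`) and the variance-based importance score are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_subgrids(feature_map, grid=4, n_select=4):
    """Importance-sample subgrid cells of a feature map.

    Splits an (H, W, C) feature map into a grid x grid layout and samples
    n_select cells with probability proportional to per-cell variance,
    a simple stand-in for "statistically informative" regions.
    """
    H, W, _ = feature_map.shape
    h, w = H // grid, W // grid
    cells, scores = [], []
    for i in range(grid):
        for j in range(grid):
            cell = feature_map[i * h:(i + 1) * h, j * w:(j + 1) * w, :]
            cells.append(cell)
            scores.append(cell.var())
    p = np.asarray(scores)
    p = p / p.sum()
    idx = rng.choice(len(cells), size=n_select, replace=False, p=p)
    return [cells[k] for k in idx]

def boosted_least_squares_weights(preds, y, n_rounds=5):
    """Combine ensemble member outputs by boosting-weighted least squares.

    preds: (n_samples, n_members) member predictions; y: (n_samples,) targets.
    Each round solves the weighted normal equations for the combination
    coefficients, then up-weights poorly fit samples (AdaBoost-flavoured),
    so the boosting weights enter the least squares fit directly.
    """
    n, m = preds.shape
    w = np.full(n, 1.0 / n)  # boosting weights over samples
    beta = np.zeros(m)
    for _ in range(n_rounds):
        # Weighted least squares: minimize sum_i w_i * (y_i - preds_i @ beta)^2
        Xw = preds * w[:, None]
        beta = np.linalg.lstsq(Xw.T @ preds, Xw.T @ y, rcond=None)[0]
        resid = np.abs(y - preds @ beta)
        # Increase weight on samples with large residuals
        w *= np.exp(resid / (resid.mean() + 1e-12))
        w /= w.sum()
    return beta
```

As a usage sketch, `sample_subgrids(fmap)` would be applied to a CNN's intermediate feature map to pick informative regions for further training, while `boosted_least_squares_weights(member_preds, targets)` would produce the combination coefficients for the ensemble; both are minimal stand-ins under the stated assumptions.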
Submission Number: 124