Promoting Ensemble Diversity with Interactive Bayesian Distributional Robustness for Fine-tuning Foundation Models

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We propose an interactive Bayesian inference method to improve ensemble quality.
Abstract: We introduce Interactive Bayesian Distributional Robustness (IBDR), a novel Bayesian inference framework that allows modeling the interactions between particles, thereby enhancing ensemble quality through increased particle diversity. IBDR is grounded in a generalized theoretical framework that connects the distributional population loss with the approximate posterior, motivating a practical dual optimization procedure that enforces distributional robustness while fostering particle diversity. We evaluate IBDR's performance against various baseline methods using the VTAB-1K benchmark and the common reasoning language task. The results consistently show that IBDR outperforms these baselines, underscoring its effectiveness in real-world applications.
Lay Summary: Bayesian inference is a powerful tool for managing uncertainty in machine learning models. One approach, particle-based Bayesian inference, involves training multiple models - referred to as "particles" - and combining their outputs to form an ensemble prediction. However, if these particles are too similar, they may collectively make the same errors, reducing the ensemble's effectiveness. In our research, we propose a method that directly models interactions between these particles to encourage diversity among them, thereby promoting diversity in their predictions and reducing the likelihood of all particles making identical mistakes. We introduce a theoretical framework that connects this interactive framework to distributionally robust optimization - a concept that ensures models perform reliably under shifting distributions. Building on this theory, we develop a practical optimization technique that simultaneously fosters diversity among particles and enhances the ensemble's robustness. This method guides each particle to explore different aspects of the data while maintaining overall model robustness. We tested our approach on tasks such as image classification and commonsense reasoning, and the results consistently showed improved performance over existing methods. Further ablation studies revealed that encouraging diversity within the ensemble helps prevent the collective failure of particles, leading to more accurate and dependable predictions.
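To make the particle-ensemble idea concrete, here is a minimal, generic sketch of averaging predictions over a set of particles and measuring how diverse their parameters are. This is an illustration only: the linear-softmax particles, the pairwise-distance diversity measure, and all function names are assumptions for exposition, not the IBDR objective or implementation from the paper (see the linked code repository for that).

```python
import numpy as np

# Illustrative sketch only: a generic particle ensemble with a pairwise
# parameter-distance diversity measure. The model form and the diversity
# measure are assumptions for illustration, not the IBDR method itself.

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the class dimension.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ensemble_predict(particles, X):
    """Average the class probabilities of all particles (here, linear models)."""
    probs = [softmax(X @ W) for W in particles]
    return np.mean(probs, axis=0)

def mean_pairwise_distance(particles):
    """Mean squared distance between particle parameters.

    A larger value means the particles are more spread out; a diversity-
    promoting method would keep this from collapsing toward zero.
    """
    n = len(particles)
    total, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += np.sum((particles[i] - particles[j]) ** 2)
            count += 1
    return total / max(count, 1)

# Toy example: 5 particles, 4 input features, 3 classes, 8 data points.
particles = [rng.normal(size=(4, 3)) for _ in range(5)]
X = rng.normal(size=(8, 4))
p = ensemble_predict(particles, X)
print(p.shape)                           # (8, 3)
print(np.allclose(p.sum(axis=1), 1.0))   # True: rows are valid distributions
print(mean_pairwise_distance(particles) > 0.0)  # True for distinct particles
```

If all particles were identical, the ensemble average would equal a single particle's prediction and the pairwise distance would be zero, which is exactly the failure mode the summary describes: the particles would then share the same mistakes.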
Link To Code: https://github.com/ngocquanai/IBDR
Primary Area: Probabilistic Methods->Variational Inference
Keywords: Distributional Robustness, Bayesian Inference, Model Finetuning, Promoting Ensemble Diversity
Submission Number: 1376