Heteroscedastic Variational Bayesian Last Layers: Modeling Input-Dependent Noise in Sparse-Data Regression

ICLR 2026 Conference Submission 17508 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Bayesian Neural Networks, Variational Bayesian Last Layer, Heteroscedastic Noise, Sparse Data Regression, Uncertainty Quantification
TL;DR: We propose Heteroscedastic VBLL to model input-dependent noise in sparse-data regression, improving uncertainty quantification for real-world industrial applications.
Abstract: Bayesian Neural Networks (BNNs) have been extensively studied for uncertainty quantification. To train BNNs efficiently, the Variational Bayesian Last Layer (VBLL) provides a sampling-free, deterministic method that significantly reduces computational cost. However, existing VBLL methods assume homoscedastic noise and sufficient data, whereas real-world industrial applications frequently encounter heteroscedastic noise, where the uncertainty level (i.e., noise) varies with the input, and collecting training data in such settings is often expensive. Modeling heteroscedastic noise with sparse data is challenging, yet it plays a critical role in setting appropriate safety margins for industrial applications. In this work, we propose Heteroscedastic VBLL (HVBLL) to effectively capture input-dependent noise. We show the impact of the noise prior on sparse-data regression and further design a clustering-based noise level estimation method to provide reliable priors. Experimental results demonstrate that our proposed methods significantly improve the performance of BNNs under heteroscedastic, sparse-data conditions.
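As a rough illustration of the two ideas in the abstract (input-dependent noise and a clustering-based noise-level estimate), the sketch below shows a last layer that emits both a mean and a log-variance per input, trained with a heteroscedastic Gaussian negative log-likelihood, together with a simple k-means pass that yields per-cluster noise estimates of the kind that could inform a prior. This is not the authors' sampling-free VBLL formulation; the names `HeteroscedasticHead` and `cluster_noise_priors` and all hyperparameters are hypothetical.

```python
# Minimal sketch, assuming a PyTorch regression setup; not the paper's VBLL posterior.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class HeteroscedasticHead(nn.Module):
    """Feature extractor with a last layer emitting mean and log-variance per input."""

    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        self.mean = nn.Linear(hidden, 1)
        self.log_var = nn.Linear(hidden, 1)  # input-dependent noise level

    def forward(self, x):
        h = self.features(x)
        return self.mean(h), self.log_var(h)


def heteroscedastic_nll(y, mean, log_var):
    # Gaussian negative log-likelihood with a per-input noise variance.
    return 0.5 * (log_var + (y - mean) ** 2 / log_var.exp()).mean()


def cluster_noise_priors(x_np, y_np, n_clusters: int = 5):
    """Crude per-cluster noise-level estimates (numpy arrays in, dict out)."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(x_np)
    priors = {}
    for k in range(n_clusters):
        y_k = y_np[labels == k]
        # Within-cluster target variance as a local noise estimate.
        priors[k] = float(y_k.var()) if len(y_k) > 1 else float("nan")
    return priors
```

In such a setup, the model would be trained by minimizing `heteroscedastic_nll` over the sparse training set, and the per-cluster estimates could be used to regularize or initialize the noise model in low-data regions; how the paper actually incorporates these priors into the VBLL objective is specified in the full text.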
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 17508