Spectral Decomposed Variational Inference: A Principled Framework for Posterior Covariance Modeling

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Variational Inference; Proximal Spectral Optimization; Bayesian Neural Networks; KL-divergence
Abstract: The Kullback-Leibler (KL) divergence, the cornerstone of standard Variational Inference (VI), acts as a blunt instrument for shaping posterior geometries, forcing an unfavorable trade-off between expressivity and scalability. This paper addresses this limitation by proposing a paradigm shift: moving variational optimization from the space of distributions into the spectral domain of the posterior covariance. We introduce Spectral Decomposed Variational Inference (SD-VI), a framework built upon a new class of objectives. Instead of the monolithic KL penalty, SD-VI enables explicit, fine-grained regularization of the covariance's eigenspectrum. This objective is optimized by our efficient and provably convergent Proximal Spectral Optimization (PSO) algorithm, which applies a sequence of analytical spectral shrinkage steps to automatically discover sparse, low-rank posterior structures in a principled manner. We demonstrate SD-VI on two challenging applications, showing that it learns significantly better-calibrated Bayesian Neural Networks and enables more scalable inference in Sparse Gaussian Processes. Our work establishes a powerful approach for building more robust and efficient Bayesian models through direct geometric control of uncertainty.
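The abstract describes PSO's core operation as an analytical spectral shrinkage step applied to the posterior covariance. Below is a minimal sketch of what one such step could look like, assuming an L1-type (soft-thresholding) penalty on the covariance's eigenvalues; the function name, the penalty choice, and the usage values are illustrative assumptions, not details taken from the paper.

import numpy as np

def spectral_shrinkage_step(Sigma, lam):
    """Hypothetical analytical shrinkage step on a symmetric
    posterior covariance Sigma (an illustrative sketch, not the
    paper's PSO algorithm)."""
    # Symmetric eigendecomposition: Sigma = U diag(w) U^T.
    w, U = np.linalg.eigh(Sigma)
    # Proximal operator of lam * sum_i |w_i| on the spectrum:
    # shrink every eigenvalue toward zero and clip at zero, which
    # keeps the result positive semi-definite and zeroes small modes.
    w_shrunk = np.maximum(w - lam, 0.0)
    # Reassemble the (now possibly low-rank) covariance.
    return (U * w_shrunk) @ U.T

# Usage: shrink a random PSD matrix and inspect the induced rank.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Sigma = A @ A.T                            # a PSD covariance
Sigma_new = spectral_shrinkage_step(Sigma, lam=1.0)
print(np.linalg.matrix_rank(Sigma_new))    # typically < 5 after shrinkage

Soft-thresholding the spectrum zeroes small eigenvalues, which is one standard way a proximal step can induce the sparse, low-rank covariance structure the abstract refers to.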
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 23888