Spectral-Topological Phase Transitions in Bayesian Neural Networks: Persistent Homology Meets Uncertainty Quantification

07 Mar 2026 (modified: 15 Mar 2026) · Submitted 2026 · License: CC BY 4.0
Keywords: Bayesian neural networks, persistent homology, spectral theory, uncertainty quantification, PAC-Bayes, calibration, Fisher information matrix, Tracy-Widom, Marchenko-Pastur
TL;DR: Spectral and topological analysis of BNN training dynamics yields phase transitions that predict calibration quality and an early-stopping criterion.
Abstract: This paper introduces a unified spectral-topological framework for understanding training dynamics in Bayesian Neural Networks (BNNs). We establish a formal connection between the Hessian eigenspectrum of the posterior landscape and the persistent homology of the loss surface, revealing phase transitions that correspond to qualitative shifts in uncertainty calibration. Our key contributions are: (1) a spectral phase transition theorem showing that the bulk eigenvalue distribution of the Fisher information matrix undergoes a Marchenko-Pastur to Tracy-Widom transition at critical posterior concentration points; (2) a persistent homology pipeline that tracks Betti numbers of sublevel sets during variational inference, showing that these topological invariants predict calibration quality; (3) refined PAC-Bayes generalization bounds that incorporate spectral decay rates, tightening standard oracle inequalities by a factor of $O(\sqrt{\log d / n})$; (4) a practical early-stopping criterion based on topological persistence that detects convergence 10--50 iterations earlier than standard methods. Experiments on UCI regression benchmarks and on CIFAR-10/100 with MC-Dropout and SWAG demonstrate that our spectral-topological indicators outperform existing calibration metrics (ECE, NLL) at predicting out-of-distribution detection performance, with a 15--25\% improvement in AUROC.
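The abstract does not specify how the persistent homology pipeline in contribution (2) is implemented. As a minimal illustrative sketch (not the authors' method), 0-dimensional sublevel-set persistence of a sampled 1-D slice of the loss surface can be computed with the standard union-find merge procedure and the elder rule: each local minimum births a connected component, and components merge (the younger one dies) as the filtration level rises. The function name and the 1-D restriction here are assumptions for illustration; the paper's pipeline presumably operates on higher-dimensional sublevel sets.

```python
def sublevel_persistence_0d(values):
    """0-dimensional persistence pairs of the sublevel-set filtration of a
    1-D sampled function. Returns (birth, death) tuples; the component of
    the global minimum never dies, so it pairs with +inf (elder rule)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent, birth = {}, {}

    def find(i):
        # Path-halving union-find lookup.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    pairs = []
    for i in order:  # sweep vertices in increasing function value
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):  # 1-D neighbors already in the sublevel set
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # Elder rule: the component born later dies at this level.
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    pairs.append((birth[young], values[i]))
                    parent[young] = old
    for r in {find(i) for i in parent}:
        pairs.append((birth[r], float("inf")))  # essential class (Betti-0 = 1)
    return pairs

# Toy loss slice: two basins (minima at 0 and 1) separated by a barrier at 4.
pairs = sublevel_persistence_0d([3, 1, 4, 0, 2])
```

On this toy input the non-global basin born at value 1 merges at the barrier value 4, giving the finite pair (1, 4); the global minimum yields the essential pair (0, inf). An early-stopping rule in the spirit of contribution (4) would then monitor a summary of such diagrams (e.g., total finite persistence) across iterations and stop once it stabilizes.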
Submission Number: 144