Efficiently performing probabilistic inference in large models is a significant challenge due to high computational demands and the continuous nature of the model parameters. At the same time, the ML community has invested considerable effort into quantizing the parameters of large-scale models to increase their computational efficiency. We extend this line of work by proposing a method for learning the probability distributions of quantized parameters via variational inference (VI), enabling the effective learning of continuous distributions over a discrete space. We consider both 2D densities and quantized neural networks, for which we introduce a tractable learning approach based on probabilistic circuits. This method offers a scalable way to manage complex distributions and provides clear insights into model behavior. We validate our approach in various settings, demonstrating its effectiveness.
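
To make the core idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation, and without the probabilistic-circuit component) of variational inference over quantized parameters: each weight gets a categorical variational posterior over a fixed quantization grid, and an ELBO is optimized with the Gumbel-softmax relaxation so gradients can flow through the discrete choice. The grid size, prior, model, and data below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

grid = torch.linspace(-1.0, 1.0, steps=8)        # assumed quantization levels
logits = torch.zeros(16, 8, requires_grad=True)  # variational logits: 16 weights x 8 levels
prior = torch.full((8,), 1.0 / 8)                # assumed uniform prior over levels

def sample_weights(tau: float = 0.5) -> torch.Tensor:
    # Relaxed one-hot samples over the grid yield (approximately) quantized weights.
    probs = F.gumbel_softmax(logits, tau=tau, hard=False)
    return probs @ grid

def elbo(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    w = sample_weights()
    pred = x @ w                                  # toy linear model as the likelihood
    log_lik = -F.mse_loss(pred, y, reduction="sum")
    q = F.softmax(logits, dim=-1)
    kl = (q * (q.clamp_min(1e-9).log() - prior.log())).sum()
    return log_lik - kl

# Synthetic data whose true weights lie on the grid.
x = torch.randn(64, 16)
y = x @ grid[torch.randint(0, 8, (16,))]
opt = torch.optim.Adam([logits], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    loss = -elbo(x, y)                            # maximize ELBO = minimize its negative
    loss.backward()
    opt.step()
```

After training, `F.softmax(logits, dim=-1)` gives, for each weight, a distribution over the discrete quantization levels; the continuous variational parameters (`logits`) are what make learning in the discrete space tractable.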