Approximate Message Passing on General Factor Graphs using Shallow Neural Networks

Published: 10 Jun 2025, Last Modified: 15 Jul 2025 · MOSS@ICML2025 · CC BY 4.0
Keywords: Message Passing, Probabilistic Machine Learning, Sampling, Factor Graphs, Shallow Neural Networks
TL;DR: We extend the factor graph framework to a wider set of problem domains by approximating message update equations without the need for analytical integration, and we validate our approach on factors with known closed-form solutions.
Abstract: Factor graphs offer an efficient framework for probabilistic inference through message passing, with the added benefit of uncertainty quantification, which is crucial in safety-critical applications. However, their applicability is limited by the need to analytically solve the update equations for each factor, which are problem-specific and may involve intractable integrals. We propose to approximate the message update equations of individual factors with shallow neural networks, trained on data generated by sampling from the respective factor equations, capturing complex factor relationships while maintaining computational tractability.
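To illustrate the idea in the abstract, here is a minimal sketch of the approach for a single hypothetical factor. All names and parameters below are illustrative assumptions, not the paper's actual implementation: a nonlinear factor (here `y = tanh(x)` plus Gaussian noise) has no closed-form outgoing message, so its moments are estimated by Monte Carlo sampling, and a shallow one-hidden-layer network is trained to map incoming-message parameters to outgoing-message moments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear factor: y = tanh(x) + Gaussian noise.
# Outgoing-message moments have no closed form; estimate them by sampling.
def sample_message(mean, var, noise_std=0.1, n=2000):
    x = rng.normal(mean, np.sqrt(var), size=n)
    y = np.tanh(x) + rng.normal(0.0, noise_std, size=n)
    return y.mean(), y.var()

# Training data: incoming Gaussian message (mean, var) -> outgoing moments.
means = rng.uniform(-2.0, 2.0, size=500)
vars_ = rng.uniform(0.1, 1.0, size=500)
X = np.stack([means, vars_], axis=1)
Y = np.array([sample_message(m, v) for m, v in zip(means, vars_)])

# Shallow network: one hidden tanh layer, full-batch gradient descent on MSE.
H = 32
W1 = rng.normal(0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 2)); b2 = np.zeros(2)
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - Y
    # Backpropagation for the mean-squared-error loss.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# At inference time, the trained network replaces the intractable integral.
def approx_message(mean, var):
    h = np.tanh(np.array([mean, var]) @ W1 + b1)
    return h @ W2 + b2

m_hat, v_hat = approx_message(0.5, 0.3)
```

Because `tanh` admits no closed-form Gaussian message, the network's output can be checked against a large-sample Monte Carlo estimate, mirroring the paper's validation strategy of comparing against known solutions where they exist.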
Code: ipynb
Submission Number: 82