Factor Graph Optimization for Belief Propagation Decoding

ICLR 2026 Conference Submission 11803 Authors

18 Sept 2025 (modified: 23 Nov 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Binary Programming, Belief Propagation, Structure Learning
TL;DR: We provide a gradient-based approach for learning the structure of the factor graph underlying Belief Propagation
Abstract: Belief Propagation (BP) is a highly efficient message-passing algorithm for inference on graphical models, famously applied to the decoding of sparse codes. The performance of BP, however, depends critically on the structure of the underlying factor graph. Designing a graph structure that is optimal for BP decoding remains a significant challenge, especially under the constraints of short block lengths or novel channel models. In this work, we introduce, for the first time, a gradient-based, data-driven framework that directly optimizes the factor graph for the Belief Propagation algorithm. We learn locally optimal graph structures by running simulations under channel noise. This is enabled by a novel, complete graph tensor representation of the Belief Propagation algorithm, which makes the decoding process end-to-end differentiable. This representation allows us to optimize the graph structure over finite fields via backpropagation, coupled with an efficient line-search method. When applied to the design of sparse codes, the resulting BP-optimized factor graphs achieve decoding performance that surpasses existing popular codes, demonstrating the power of data-driven approaches to code design.
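To illustrate the core idea of a differentiable tensor formulation of BP, the following is a minimal sketch (not the authors' implementation): the binary parity-check adjacency is relaxed to a continuous weight matrix `W`, and sum-product message passing is written entirely in smooth tensor operations, so the decoder output is differentiable with respect to `W`. The function name `soft_bp_decode`, the gating scheme `W*t + (1-W)`, and the example code are all illustrative assumptions.

```python
import numpy as np

def soft_bp_decode(llr, W, n_iters=5):
    """Soft-gated sum-product BP over a relaxed factor graph.

    llr : (n,) channel log-likelihood ratios (positive favors bit 0).
    W   : (m, n) relaxed adjacency in [0, 1]; W = H recovers standard BP.
    All operations (tanh, products, arctanh) are smooth, so the posterior
    LLRs are differentiable in W -- the property the paper exploits.
    """
    m, n = W.shape
    v2c = np.tile(llr, (m, 1))                      # variable-to-check messages
    for _ in range(n_iters):
        # Check-node update in the tanh domain.
        t = np.tanh(np.clip(v2c, -15.0, 15.0) / 2.0)
        # Soft gating: an edge with W -> 0 contributes a neutral factor of 1.
        gated = W * t + (1.0 - W)                   # (m, n), |gated| <= 1
        prod_all = np.prod(gated, axis=1, keepdims=True)
        # Extrinsic product excludes the edge's own term (division trick).
        safe = np.where(np.abs(gated) > 1e-12, gated, 1e-12)
        ext = prod_all / safe
        c2v = 2.0 * np.arctanh(np.clip(W * ext, -0.999999, 0.999999))
        # Variable-node update: channel LLR plus extrinsic check messages.
        total = llr + np.sum(c2v, axis=0)
        v2c = total[None, :] - c2v
    return llr + np.sum(c2v, axis=0)                # posterior LLRs
```

For example, with the two parity checks H = [[1,1,0],[0,1,1]] and channel LLRs [2.0, -0.5, 2.0] (the middle bit received unreliably, leaning toward 1), the parity constraints pull the middle bit's posterior LLR positive, recovering the all-zeros codeword. Because the forward pass is smooth in `W`, gradients through this decoder can drive a structure search over candidate graphs, which is the mechanism the abstract describes.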
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 11803