ePC: Overcoming Exponential Signal Decay in Deep Predictive Coding Networks

ICLR 2026 Conference Submission 7533 Authors

16 Sept 2025 (modified: 23 Dec 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Predictive Coding, Reparametrization, Biologically plausible learning, Energy-based models, Computational neuroscience
TL;DR: We show that standard Predictive Coding is ill-suited to digital simulation and propose a fully equivalent alternative that overcomes these flaws.
Abstract: Predictive Coding (PC) offers a bio-inspired alternative to backpropagation for neural network training, described as a physical system minimizing its internal energy. In practice, however, PC is predominantly _digitally simulated_, requiring excessive amounts of compute while struggling to scale to deeper architectures. This paper reformulates PC to overcome this hardware-algorithm mismatch. First, we show how the canonical state-based formulation of PC (sPC) is, by design, deeply inefficient in digital simulation, inevitably producing exponential signal decay that stalls the entire minimization process. Then, to overcome this fundamental limitation, we introduce error-based PC (ePC), a novel reparameterization of PC that does not suffer from signal decay. Though no longer biologically plausible, ePC numerically computes exact PC weight gradients and runs orders of magnitude faster than sPC. Experiments across multiple architectures and datasets demonstrate that ePC matches backpropagation's performance even for deeper models where sPC struggles. Beyond these practical improvements, our work provides theoretical insight into PC dynamics and establishes a foundation for scaling PC-based learning to deeper architectures on digital hardware and beyond.
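For context, the canonical state-based PC (sPC) that the abstract refers to can be sketched as follows. This is a minimal illustration of the standard formulation from the PC literature (energy minimization over layer states, with weight gradients read off from the prediction errors at equilibrium), not the paper's ePC reparameterization; the layer sizes, tanh nonlinearity, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, dphi = np.tanh, lambda z: 1.0 - np.tanh(z) ** 2

dims = [4, 8, 8, 2]  # illustrative layer widths
Ws = [rng.normal(scale=0.1, size=(dims[l + 1], dims[l]))
      for l in range(len(dims) - 1)]

def energy(xs, Ws):
    # E = sum_l 0.5 * ||x_{l+1} - W_l phi(x_l)||^2  (sum of squared prediction errors)
    return sum(0.5 * np.sum((xs[l + 1] - Ws[l] @ phi(xs[l])) ** 2)
               for l in range(len(Ws)))

x_in = rng.normal(size=dims[0])    # clamped input
y_tgt = rng.normal(size=dims[-1])  # clamped target

# feedforward initialization of the states, then clamp the output layer
xs = [x_in]
for W in Ws:
    xs.append(W @ phi(xs[-1]))
xs[-1] = y_tgt

e_before = energy(xs, Ws)

lr = 0.1
for _ in range(200):                 # inference phase: gradient descent on states
    eps = [xs[l + 1] - Ws[l] @ phi(xs[l]) for l in range(len(Ws))]
    for l in range(1, len(xs) - 1):  # input and output layers stay clamped
        # dE/dx_l = eps_{l-1} - phi'(x_l) * (W_l^T eps_l)
        xs[l] -= lr * (eps[l - 1] - dphi(xs[l]) * (Ws[l].T @ eps[l]))

e_after = energy(xs, Ws)

# weight gradients at (approximate) equilibrium: dE/dW_l = -eps_l phi(x_l)^T
grads = [-np.outer(eps[l], phi(xs[l])) for l in range(len(Ws))]
```

The inference loop above is the digitally simulated energy minimization whose cost and signal-decay behavior the paper analyzes: each state update only propagates error information one layer per iteration, which is the mechanism behind the scaling issues sPC faces in deeper networks.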
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Submission Number: 7533