Keywords: Neuroscience, Predictive Coding, Meta Predictive Coding, Free Energy
TL;DR: Meta-PCN fixes deep PCN training instabilities—PE imbalance and exploding/vanishing prediction errors—via a meta-PE loss and weight variance regularization, yielding statistically significant gains on CIFAR-10/100 and TinyImageNet.
Abstract: Predictive Coding Networks (PCNs) offer a biologically inspired alternative to conventional deep neural networks.
However, their scalability is hindered by severe training instabilities that intensify with network depth.
Through dynamical mean-field analyses, we identify two fundamental pathologies that impede deep PCN training:
(1) prediction error (PE) imbalance that leads to uneven learning across layers, characterized by error concentration at network boundaries; and
(2) exploding and vanishing prediction errors (EVPE) that are sensitive to the weight variance.
To address these challenges, we propose Meta-PCN, a unified framework that incorporates two synergistic components:
(1) a meta-prediction-error (meta-PE) loss, which minimizes the PEs of PEs to linearize the nonlinear inference dynamics; and
(2) weight regularization that combines normalization and clipping to regulate weight variance and mitigate EVPE.
Extensive experimental validation on CIFAR-10/100 and TinyImageNet demonstrates that Meta-PCN achieves statistically significant improvements over conventional PCNs and backpropagation across most architectures, while maintaining biological plausibility.
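To make component (1) concrete, below is a minimal NumPy sketch of discriminative PCN inference with a hypothetical meta-PE term. The energy and update rules are standard PCN; the `lam_meta` penalty on the step-to-step change of the PEs is one plausible reading of "minimizing PEs of PEs" (the abstract does not give the exact formulation), and all names here are our own.

```python
import numpy as np

def f(x):                      # activation
    return np.tanh(x)

def df(x):                     # activation derivative
    return 1.0 - np.tanh(x) ** 2

def pcn_inference(x, W, n_steps=20, lr_x=0.1, lam_meta=0.1):
    """Relax the hidden states of a discriminative PCN.

    x[0] is clamped to the input, x[-1] to the target; layer l predicts
    x[l+1] via mu[l] = W[l] @ f(x[l]).  Energy: 0.5 * sum_l ||e[l]||^2,
    plus a hypothetical meta-PE term 0.5 * lam_meta * ||e_t - e_{t-1}||^2
    penalizing how much the PEs themselves change between inference steps
    (a stand-in for the paper's meta-PE loss, which may differ).
    """
    L = len(W)
    prev_e = None
    for _ in range(n_steps):
        # Prediction errors at each layer boundary.
        e = [x[l + 1] - W[l] @ f(x[l]) for l in range(L)]
        meta = ([np.zeros_like(el) for el in e] if prev_e is None
                else [e[l] - prev_e[l] for l in range(L)])
        # Effective error = ordinary PE plus the meta-PE term
        # (previous-step PEs are treated as constants).
        eff = [e[l] + lam_meta * meta[l] for l in range(L)]
        for k in range(1, L):  # x[0] and x[L] stay clamped during training
            grad = eff[k - 1] - df(x[k]) * (W[k].T @ eff[k])
            x[k] = x[k] - lr_x * grad
        prev_e = [el.copy() for el in e]
    return x, e

def hebbian_weight_update(x, e, W, lr_w=1e-3):
    # Local, biologically plausible weight update: dW[l] ~ e[l] f(x[l])^T.
    return [W[l] + lr_w * np.outer(e[l], f(x[l])) for l in range(len(W))]
```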
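Component (2) can be pictured as follows. This is a sketch under our own assumptions (a target variance of 1/fan_in as in standard initialization schemes, and a symmetric clipping window of `clip_k` target standard deviations), not the paper's exact normalization/clipping rule.

```python
import numpy as np

def regulate_weight_variance(W, clip_k=3.0, eps=1e-8):
    """Hypothetical weight regularizer combining normalization and clipping.

    Each matrix is rescaled so its elementwise std matches 1/sqrt(fan_in),
    then extreme entries are clipped to +/- clip_k target stds.
    """
    out = []
    for Wl in W:
        fan_in = Wl.shape[1]
        target_std = fan_in ** -0.5
        Wl = Wl * (target_std / (Wl.std() + eps))  # variance normalization
        Wl = np.clip(Wl, -clip_k * target_std,     # clip extreme entries
                     clip_k * target_std)
        out.append(Wl)
    return out
```

Applied after each weight update, a step of this kind keeps the weight variance in the band where, per the abstract's mean-field argument, prediction errors neither explode nor vanish with depth.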
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 16165