Forward Target Propagation: A Forward-Only Approach to Global Error Credit Assignment via Local Losses

ICLR 2026 Conference Submission 21114 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Forward Learning, Target Propagation, Deep Learning, Local Learning, Biologically Plausible Algorithm, Backpropagation, Neuromorphic Computing, Edge Hardware
Abstract: Training neural networks has traditionally relied on backpropagation (BP), a gradient-based algorithm that, despite its widespread success, suffers from key limitations from both biological and hardware perspectives. These include error propagation through symmetric feedback weights, non-local credit assignment, update locking, and frozen activity during the backward pass. We propose Forward Target Propagation (FTP), a biologically plausible and computationally efficient alternative that replaces the backward pass with a second forward pass. FTP estimates layer-wise targets using only feedforward computations, eliminating the need for symmetric feedback weights or learnable inverse functions and thereby enabling modular, local learning. We evaluate FTP on fully connected networks, CNNs, and RNNs, demonstrating accuracies competitive with BP on MNIST, CIFAR-10, and CIFAR-100, as well as effective modeling of long-term dependencies in sequential tasks. FTP shows improved robustness under low-precision quantization and emerging-hardware constraints, while also achieving substantial efficiency gains over other biologically inspired methods such as target propagation variants and forward-only learning algorithms. With its minimal computational overhead, forward-only nature, and hardware compatibility, FTP offers a promising direction for energy-efficient on-device learning and neuromorphic computing.
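To make the abstract's two-pass recipe concrete, below is a minimal toy sketch of a forward-only training step with local, layer-wise targets. The specific target-generating rule used here (modulating the input with a fixed random projection `F` of the output error before the second forward pass, in the style of PEPITA-like forward-only methods) is an illustrative assumption, not the paper's exact FTP rule; all names (`W1`, `W2`, `F`, `train_step`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Tiny 2-layer MLP: 784 -> 256 -> 10 (illustrative sizes)
W1 = rng.normal(0.0, 0.05, (256, 784))
W2 = rng.normal(0.0, 0.05, (10, 256))
F  = rng.normal(0.0, 0.05, (784, 10))  # fixed random error projection (assumed rule)
lr = 0.01

def train_step(x, y_onehot):
    """One forward-only update: no backward pass, only two forward passes."""
    global W1, W2
    # --- First forward pass: record each layer's activation ---
    h1 = relu(W1 @ x)
    y  = W2 @ h1
    err = y - y_onehot               # output error, local to the top layer

    # --- Second forward pass on an error-modulated input ---
    # Assumed target-generating rule: the clean forward computation applied
    # to a modulated input yields a layer-wise target for each hidden layer.
    x_mod  = x - F @ err
    h1_tgt = relu(W1 @ x_mod)        # local target for layer 1
    y_tgt  = y_onehot                # local target for the output layer

    # --- Local updates: each layer regresses onto its own target ---
    W1 += lr * np.outer(h1_tgt - h1, x)
    W2 += lr * np.outer(y_tgt - y, h1_tgt)
    return err

# Usage: one step on a random "image" with label 3
x = rng.random(784)
y = np.eye(10)[3]
print(np.abs(train_step(x, y)).mean())
```

Note how every update uses only quantities available at that layer during the two forward passes, which is what removes symmetric feedback weights, learned inverses, and update locking from the picture.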
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Submission Number: 21114