Local Back-Propagation for Forward-Forward Networks: Independent Unsupervised Layer-Wise Training

Taewook Hwang, Hyein Seo, Sangkeun Jung

Published: 23 Jul 2025 · Last Modified: 23 Apr 2026 · Applied Sciences · CC BY-SA 4.0
Abstract: Recent deep learning models, including GPT-4, have achieved remarkable performance using the back-propagation (BP) algorithm. However, BP operates in a fundamentally different way from how learning occurs in the human brain. To address this discrepancy, the Forward-Forward (FF) algorithm was introduced. Although FF enables deep learning without backward passes, it suffers from training instability, dependence on artificially constructed inputs, and limited generalizability. To overcome these challenges, we propose Local Back-Propagation (LBP), a method that combines layer-wise unsupervised learning with standard inputs and conventional loss functions. LBP achieves high training stability and competitive accuracy, significantly outperforming FF-based training methods. Moreover, LBP reduces memory usage by up to 48% compared with convolutional neural networks trained with back-propagation, making it particularly suitable for resource-constrained settings such as federated learning. These results suggest that LBP is a promising biologically inspired training method for decentralized deep learning.
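To make the core idea concrete, here is a minimal, hypothetical PyTorch sketch of independent layer-wise training with local back-propagation. The abstract does not specify the paper's exact per-layer objective, so a per-layer reconstruction loss stands in as one possible "conventional loss function"; all names (LocalLayer, train_step) and hyperparameters are illustrative assumptions, not the authors' implementation. The key mechanism is that detach() stops gradients at layer boundaries, so each layer runs its own small, self-contained backward pass.

```python
import torch
import torch.nn as nn

class LocalLayer(nn.Module):
    """One layer trained independently with a local loss (hypothetical sketch)."""

    def __init__(self, d_in, d_out):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
        self.decoder = nn.Linear(d_out, d_in)  # local auxiliary head (assumed)
        self.opt = torch.optim.Adam(self.parameters(), lr=1e-3)
        self.loss_fn = nn.MSELoss()

    def train_step(self, x):
        x = x.detach()                  # gradients never cross layer boundaries
        h = self.encoder(x)
        loss = self.loss_fn(self.decoder(h), x)  # unsupervised local objective
        self.opt.zero_grad()
        loss.backward()                 # back-propagation stays local to this layer
        self.opt.step()
        return h.detach(), loss.item()

# Train a small stack layer by layer on a dummy batch of flattened images.
layers = [LocalLayer(784, 256), LocalLayer(256, 128)]
x = torch.randn(32, 784)
for layer in layers:
    x, loss = layer.train_step(x)
```

Because no activations need to be retained for a global backward pass, each layer's memory footprint is bounded by its own computation, which is consistent with the memory savings the abstract reports.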