Emergent representations in networks trained with the Forward-Forward algorithm
Keywords: Forward-Forward, Representations, Sensory cortex, Ensembles, Sparsity, Backpropagation
TL;DR: Neural networks trained with the Forward-Forward algorithm exhibit sparse representations that resemble those found in sensory cortex.
Abstract: The Backpropagation algorithm, widely used to train neural networks, has often been criticised for its lack of biological realism. In an attempt to find a more biologically plausible alternative that avoids back-propagating gradients in favour of local learning rules, the recently introduced Forward-Forward algorithm replaces the traditional forward and backward passes of Backpropagation with two forward passes. In this work, we show that the internal representations obtained with the Forward-Forward algorithm can organize into robust, category-specific ensembles composed of an extremely small number of active units (high sparsity). This is reminiscent of what has been observed in cortical sensory areas, where neuronal ensembles are suggested to serve as the functional building blocks for perception and action. Interestingly, while these ensembles do not typically arise in models trained with standard Backpropagation, they can emerge in networks optimized by Backpropagation when given the same training objective as the Forward-Forward algorithm. These findings suggest that the learning procedure proposed by Forward-Forward may surpass Backpropagation in its capacity to model learning in the cortex, even when a backward pass is used, and may inspire new approaches to comparing representations in biological and artificial neural networks.
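For readers unfamiliar with the training objective referred to in the abstract, the following is a minimal illustrative sketch (not the authors' code) of a single layer trained with the Forward-Forward procedure as described by Hinton (2022): each layer optimizes a local "goodness" objective (squared activations), pushed up on positive (real) data and down on negative data, so no gradient is propagated across layers. The class name `FFLayer` and the hyperparameters `threshold` and `lr` are illustrative assumptions.

```python
import torch
import torch.nn as nn


class FFLayer(nn.Module):
    """One Forward-Forward layer with its own local optimizer (illustrative sketch)."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.relu = nn.ReLU()
        self.threshold = threshold  # goodness threshold (assumed value)
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalise the input so only the direction of the previous layer's
        # activity is passed on, as in the original formulation.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold for
        # positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,   # penalise low goodness on positive data
            g_neg - self.threshold,   # penalise high goodness on negative data
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()  # gradients stay local to this layer's parameters
        self.opt.step()
        # Detach before handing activations to the next layer: no gradient flows back.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

A full network is then trained by stacking such layers and calling `train_step` on each in turn, feeding the detached positive and negative activations forward; the paper's comparison with Backpropagation applies this same goodness objective but lets gradients flow end to end.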
Primary Area: visualization or interpretation of learned representations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9198