Abstract: Massive backpropagated models can outperform humans on a variety of tasks but suffer from high power consumption and poor generalisation. Local learning, which updates subsets of a model's parameters at a time, has emerged as a promising technique to address these issues. Recently, a novel local learning algorithm, called Forward-Forward, has received widespread attention due to its innovative approach to learning. Unfortunately, its application has been limited to smaller datasets due to scalability issues. To this end, we propose The Trifecta, a collection of three simple techniques that drastically improve the Forward-Forward algorithm on deeper networks. Our experiments demonstrate that our models are on par with similarly structured, backpropagation-based models in both training speed and test accuracy on simple datasets. Specifically, we achieve around 84% accuracy on CIFAR-10, a notable improvement of roughly 25% over the original FF algorithm.
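For readers unfamiliar with the layer-local, goodness-based training that the abstract refers to, the snippet below is a minimal illustrative sketch in the spirit of Hinton's original Forward-Forward recipe, not the authors' implementation or The Trifecta's modifications (see the linked repository for those). The layer sizes, the threshold of 2.0, and the learning rate are arbitrary illustrative choices.

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One layer trained with a local, goodness-based objective (no gradients flow between layers)."""
    def __init__(self, d_in, d_out, threshold=2.0, lr=1e-3):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()
        self.threshold = threshold  # illustrative value
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalise the input so only its direction, not its magnitude, is passed on.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold for
        # positive data and below it for negative data.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,  # positive samples: goodness should exceed the threshold
            g_neg - self.threshold,  # negative samples: goodness should stay below it
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach the outputs so the next layer's update stays strictly local.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Usage sketch: train a small stack layer by layer on (positive, negative) batches.
layers = [FFLayer(784, 512), FFLayer(512, 512)]
x_pos, x_neg = torch.randn(32, 784), torch.randn(32, 784)  # placeholder data
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)
```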
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=WuCaFyDHFs
Changes Since Last Submission: Changed the title.
Code: https://github.com/tdooms/trifecta
Assigned Action Editor: ~Pablo_Samuel_Castro1
Submission Number: 3146