Abstract: Back-propagation is a widely used algorithm for training neural networks that adjusts weights according to error gradients. However, its reliance on global derivative computation makes it biologically implausible, and it lacks robustness in long-term dynamic learning. The Forward-Forward algorithm is a previously proposed alternative that removes the global gradient dependency and localises computation to individual layers, making it a more biologically plausible approach. However, Forward-Forward has only been evaluated in limited settings, does not yet match back-propagation's performance, and supports only classification, not regression. This research introduces the Metamorphic Forward Adaptation Network (MFAN), which retains the layer-wise architecture of the Forward-Forward algorithm and uses a contrastive learning objective as its core. Whereas Forward-Forward is limited to discrete classification, MFAN can process both discrete and continuous data, showing stability, adaptability, and the ability to handle evolving data. MFAN performs well in continuous data-stream scenarios, demonstrating superior adaptability and robustness compared to back-propagation, particularly in tasks requiring dynamic, long-term learning.
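For readers unfamiliar with the layer-wise, locally trained architecture the abstract refers to, the sketch below illustrates the per-layer goodness objective of the Forward-Forward algorithm (Hinton, 2022) that MFAN builds on: each layer is trained in isolation to raise the goodness (sum of squared activations) of positive samples above a threshold and lower it for negative samples, so no gradient ever crosses a layer boundary. This is a minimal PyTorch sketch of the general Forward-Forward idea, not the MFAN implementation; the layer sizes, threshold, learning rate, and random placeholder data are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One locally trained layer in the Forward-Forward style.

    Goodness = sum of squared activations; training pushes it above
    `threshold` for positive samples and below it for negative ones.
    """
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalise the input so only its direction is passed on;
        # each layer then computes its own goodness from scratch.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Local contrastive objective: no gradient crosses layer boundaries.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,   # push positive goodness up
            g_neg - self.threshold,   # push negative goodness down
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach outputs so the next layer trains on fixed inputs.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Illustrative layer-by-layer training loop with placeholder data.
layers = [FFLayer(784, 500), FFLayer(500, 500)]
x_pos, x_neg = torch.rand(64, 784), torch.rand(64, 784)
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)
```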
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: We clarified: (i) the robustness and sensitivity analysis; (ii) the CIFAR‑10 results, as requested. We also re‑checked the manuscript for consistency and grammar, and added a new version of Fig. 1 for clarity.
Video: https://youtu.be/nGv8Nno2hsw
Code: https://github.com/AerialRoboticsGroup/mfan
Assigned Action Editor: ~Mykola_Pechenizkiy1
Submission Number: 3782