Keywords: Bio-inspired, PEFT, transfer learning
Abstract: Pre-trained Artificial Neural Networks (ANNs) demonstrate robust pattern recognition abilities, closely mirroring the functionality of Biological Neural Networks (BNNs). We are particularly intrigued by these models' capacity to acquire new knowledge through fine-tuning, such as Parameter-Efficient Fine-Tuning (PEFT). Given that both ANNs and BNNs propagate information layer by layer, a useful analogy can be drawn: ANN weights correspond to synapses in BNNs, while features (latent variables or activations) parallel the neurotransmitters released by neurons. Building on this analogy, we explore the connection between feature adjustment and weight adjustment, resulting in our proposed method Synapses \& Neurons (SAN), which learns scaling matrices for features and propagates their effects to posterior weight matrices. Our approach draws strong inspiration from well-known neuroscience phenomena, Long-Term Potentiation (LTP) and Long-Term Depression (LTD), which likewise reveal the relationship between synapse development and neurotransmitter release levels. We conducted extensive PEFT comparisons on 26 datasets using both attention-based and convolution-based networks, achieving significant improvements over other tuning methods: +8.5\% over full fine-tuning, +7\% over Visual Prompt Tuning, and +3.2\% over Low-Rank Adaptation (LoRA).
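To make the described mechanism concrete, below is a minimal PyTorch sketch of the diagonal feature-scaling idea from the abstract: learn a per-channel scale on a frozen layer's output features, then fold that scale into the posterior weight matrix. The names `SANLinear` and `fold_scale_into_next` are illustrative assumptions, not the paper's actual implementation, and the exact propagation rule SAN uses is not specified in the abstract.

```python
import torch
import torch.nn as nn

class SANLinear(nn.Module):
    """Hypothetical sketch: wrap a frozen pre-trained linear layer and learn a
    per-channel scale on its output features (the "neurotransmitter" side of
    the synapse/neuron analogy)."""
    def __init__(self, base: nn.Linear):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                  # keep pre-trained weights frozen
        self.scale = nn.Parameter(torch.ones(base.out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) * self.scale             # element-wise feature scaling


@torch.no_grad()
def fold_scale_into_next(next_linear: nn.Linear, scale: torch.Tensor) -> None:
    """Propagate a learned feature scale into the posterior weight matrix:
    W_next @ (s * h) == (W_next * s) @ h for a diagonal scale s, so scaling
    the columns of W_next reproduces the scaled-feature computation exactly."""
    next_linear.weight.mul_(scale.unsqueeze(0))      # scale the columns of W_next
```

After training, `fold_scale_into_next` absorbs the learned scales into the following layer at no inference cost, which is one plausible reading of "propagates their effects to posterior weight matrices."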
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 423