Keywords: Transfer Learning, Multi-Domain Learning, Multi-Task Learning
TL;DR: We propose a novel fine-grained parameter-sharing method for efficient and comprehensive knowledge transfer, addressing issues with current coarse-grained sharing solutions.
Abstract: Knowledge transfer aims to apply existing knowledge to different tasks or new data, and it has extensive applications in multi-domain and multi-task learning.
The key challenge is to quickly identify a fine-grained unit for knowledge sharing and to transfer knowledge efficiently.
Current methods, such as fine-tuning, layer-wise parameter sharing, and task-specific adapters, offer only coarse-grained sharing and struggle to search effectively for shared parameters, which limits both the performance and the efficiency of knowledge transfer.
To address these issues, we propose Channel-Wise Parameter Sharing (CWPS), a novel fine-grained parameter-sharing method for knowledge transfer that is efficient, comprehensive, and plug-and-play.
To overcome the coarse granularity of existing approaches, we refine the granularity of shared parameters from whole layers to individual neurons, so that knowledge learned from previous tasks can be reused through an explicit composition of model neurons (a minimal illustrative sketch follows the abstract).
Besides, we propose an effective search strategy that minimizes computational cost and simplifies the process of determining which weights to share.
In addition, CWPS has strong composability and generalization ability: in principle, it can be applied to any network composed of linear and convolutional layers.
We conduct experiments on several datasets in both incremental learning and multi-task learning scenarios. Our method achieves state-of-the-art precision-to-parameter ratio performance with various backbones, demonstrating its efficiency and versatility.
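To make the idea of channel-wise (neuron-level) sharing concrete, below is a minimal, hypothetical PyTorch sketch, not the authors' implementation: each output channel of a convolution is either reused from a frozen base kernel (knowledge from previous tasks) or taken from a trainable task-specific kernel, selected by a binary share mask. The class name, mask convention, and initialization are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelSharedConv2d(nn.Module):
    """Illustrative sketch of channel-wise parameter sharing (not the paper's code).

    share_mask[c] == 1 -> output channel c reuses the frozen base weights;
    share_mask[c] == 0 -> output channel c uses trainable task-specific weights.
    """

    def __init__(self, in_ch, out_ch, kernel_size, share_mask):
        super().__init__()
        # Binary mask over output channels, broadcastable against the kernel.
        self.register_buffer("share_mask", share_mask.view(-1, 1, 1, 1).float())
        shape = (out_ch, in_ch, kernel_size, kernel_size)
        # Frozen base weights: knowledge learned on previous tasks.
        self.base_weight = nn.Parameter(torch.randn(shape), requires_grad=False)
        # Task-specific weights: only the non-shared channels effectively train.
        self.task_weight = nn.Parameter(torch.randn(shape) * 0.01)
        self.padding = kernel_size // 2

    def forward(self, x):
        # Compose the effective kernel channel by channel from shared and new parts.
        weight = (self.share_mask * self.base_weight
                  + (1.0 - self.share_mask) * self.task_weight)
        return F.conv2d(x, weight, padding=self.padding)

# Usage example: share the first 48 of 64 output channels with previous tasks.
mask = torch.zeros(64)
mask[:48] = 1
layer = ChannelSharedConv2d(3, 64, 3, mask)
out = layer(torch.randn(2, 3, 32, 32))  # -> shape (2, 64, 32, 32)
```

In this sketch the gradient of the masked channels is zero, so training a new task updates only its task-specific channels while the shared channels stay fixed; the per-channel mask is where a search strategy such as the one described above would operate.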
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13647