Low-coherence Subspace Projection: Enhancing the Learning Capacity of Orthogonal Projection Methods on Long Task Sequences

18 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: continual learning, learning capacity degradation, orthogonal projection, low-coherence, catastrophic forgetting
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: This paper experimentally observes that Gradient Orthogonal Projection (GOP) methods suffer from the learning capacity degradation problem and proposes a novel method, Low-coherence Subspace Projection (LcSP), to solve it.
Abstract: Gradient Orthogonal Projection (GOP) is an efficient strategy for mitigating catastrophic forgetting in continual learning. Despite its success, GOP-based methods often suffer from learning capacity degradation as the number of tasks increases. To address this problem, we propose a novel plug-and-play method that learns new tasks in low-coherence subspaces rather than strictly orthogonal ones. Specifically, we construct a unified cost function with the DNN parameters lying on the Oblique manifold, and develop a corresponding gradient descent algorithm that jointly minimizes both inter-task and intra-task coherence. We then provide a theoretical analysis showing the advantages of the proposed method in terms of stability and plasticity. Experimental results show that, compared with baselines, the proposed method maintains its learning capacity markedly better as the number of tasks grows, especially on long task sequences.
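Since only the abstract is available on this page, the following is a minimal, hypothetical PyTorch sketch of the idea it describes: keeping a parameter basis on the Oblique manifold (unit-norm columns) while penalizing both intra-task coherence among its own columns and inter-task coherence against bases frozen from earlier tasks. All names (`oblique_retract`, `coherence`, `lcsp_step`, `frozen_bases`) and the hyperparameters `lam` and `lr` are illustrative assumptions, not the authors' implementation.

```python
import torch

def oblique_retract(W):
    # Retraction onto the Oblique manifold: rescale each column of W
    # to unit Euclidean norm (clamp avoids division by zero).
    return W / W.norm(dim=0, keepdim=True).clamp_min(1e-12)

def coherence(A, B):
    # Mutual coherence of two unit-column bases: the largest absolute
    # inner product between any column of A and any column of B.
    return (A.t() @ B).abs().max()

def lcsp_step(W, frozen_bases, task_loss, lam=0.1, lr=1e-2):
    # One gradient step on a joint cost combining the task loss with
    # intra-task coherence (among W's own columns) and inter-task
    # coherence (against bases frozen from previous tasks), followed
    # by a retraction back onto the Oblique manifold.
    W = W.detach().clone().requires_grad_(True)
    intra = (W.t() @ W - torch.eye(W.shape[1])).abs().max()
    inter = sum(coherence(W, B) for B in frozen_bases) if frozen_bases else 0.0
    loss = task_loss(W) + lam * (intra + inter)
    loss.backward()
    with torch.no_grad():
        W_new = W - lr * W.grad
    return oblique_retract(W_new)
```

In this sketch, after a task is finished its retracted basis would be appended to `frozen_bases`, so that training on subsequent tasks is driven toward subspaces with low mutual coherence with everything learned before, rather than the strictly orthogonal subspaces used by GOP-based methods.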
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1417