Continual Learning in Open-vocabulary Classification with Complementary Memory Systems

Published: 15 Oct 2024. Last Modified: 15 Oct 2024. Accepted by TMLR. License: CC BY 4.0
Abstract: We introduce a method for flexible and efficient continual learning in open-vocabulary image classification, drawing inspiration from the complementary learning systems observed in human cognition. Specifically, we propose to combine predictions from a CLIP zero-shot model and an exemplar-based model, weighted by the zero-shot estimated probability that a sample's class is among the exemplar classes. We also propose a "tree probe" method, an adaptation of lazy learning principles, which enables fast learning from new examples with accuracy competitive with batch-trained linear models. We evaluate in data-incremental, class-incremental, and task-incremental settings, as well as the ability to perform flexible inference on varying subsets of zero-shot and learned categories. Our proposed method achieves a good balance of learning speed, target-task effectiveness, and zero-shot effectiveness. Code is available at https://github.com/jessemelpolio/TreeProbe.
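The prediction-fusion idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, array shapes, and the specific mixing rule (weighting the exemplar model by the zero-shot probability mass assigned to the exemplar classes) are assumptions for exposition.

```python
import numpy as np

def combine_predictions(p_zeroshot, p_exemplar, exemplar_idx):
    """Hypothetical sketch of fusing zero-shot and exemplar-model outputs.

    p_zeroshot   : (C,) zero-shot probabilities over all C open-vocabulary classes
    p_exemplar   : (K,) exemplar-model probabilities over the K learned classes
    exemplar_idx : (K,) indices of the learned classes within the C classes
    """
    # Zero-shot estimate that the sample's class is among the exemplar classes.
    w = p_zeroshot[exemplar_idx].sum()

    # Within the exemplar classes, trust the exemplar model with weight w;
    # retain the zero-shot distribution with weight (1 - w).
    p_final = (1.0 - w) * p_zeroshot.copy()
    p_final[exemplar_idx] += w * p_exemplar
    return p_final  # still sums to 1
```

Because both input distributions sum to one, the fused output remains a valid distribution over all C classes, so zero-shot and learned categories can be scored jointly at inference time.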
Submission Length: Long submission (more than 12 pages of main content)
Video: https://youtu.be/RkmeJupwNkY
Code: https://github.com/jessemelpolio/TreeProbe
Assigned Action Editor: ~Yanwei_Fu2
Submission Number: 3010