$MC^2$: Multimodal Concept-based Continual learning

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Concept-based Models, Continual Learning, Interpretability
TL;DR: A novel approach to creating multimodal embeddings grounded in human-defined concepts for interpretable classification.
Abstract: The inability of deep neural networks to learn continually while retaining interpretability limits their deployment in critical settings. Existing research has made strides in either interpretability or continual learning, but the synergy of these two directions remains largely under-explored. This work examines this intersection from the perspective of concept-based models, where classes are treated as combinations of text-based concepts, which can enhance the interpretability of models in a continual learning setting. Addressing the unique challenges of learning new concepts without forgetting past ones, our method $\mathbf{MC^2}$ learns both classes and concepts seamlessly over time. We adopt a multimodal approach to concepts, emphasizing text-based human-understandable semantics associated with images. Through various experimental studies, we demonstrate that $\mathbf{MC^2}$ outperforms existing concept-based approaches by a large margin in a continual setting, while performing comparably, if not better, in full-data settings. We also demonstrate that $\mathbf{MC^2}$ can be used as a post-hoc interpretability method to examine image regions associated with abstract textual concepts. Our code for $\mathbf{MC^2}$ will be publicly released upon acceptance.
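To make the concept-based framing concrete, below is a minimal illustrative sketch (not the authors' $MC^2$ implementation, whose details are not given here) of a concept-bottleneck-style head: image embeddings from a frozen multimodal encoder are scored against text embeddings of human-defined concepts, and class logits are linear combinations of those concept activations, so the class weights expose which concepts drive each prediction. All names (`ConceptBottleneckHead`, the embedding dimension, the use of cosine similarity) are assumptions for illustration.

```python
# Hypothetical sketch of a concept-bottleneck classifier over text-based concepts.
# This is NOT the MC^2 method; it only illustrates "classes as combinations of concepts".
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConceptBottleneckHead(nn.Module):
    def __init__(self, concept_text_embeddings: torch.Tensor, num_classes: int):
        super().__init__()
        # Frozen text embeddings for human-defined concepts, shape (num_concepts, dim).
        self.register_buffer(
            "concept_embeddings", F.normalize(concept_text_embeddings, dim=-1)
        )
        num_concepts = concept_text_embeddings.shape[0]
        # Class scores are linear combinations of concept activations, so the
        # learned weights indicate which concepts contribute to each class.
        self.class_head = nn.Linear(num_concepts, num_classes, bias=False)

    def forward(self, image_embeddings: torch.Tensor) -> torch.Tensor:
        # Concept activations: cosine similarity between image and concept text embeddings.
        img = F.normalize(image_embeddings, dim=-1)        # (batch, dim)
        concept_scores = img @ self.concept_embeddings.T   # (batch, num_concepts)
        return self.class_head(concept_scores)             # (batch, num_classes)


if __name__ == "__main__":
    # Random stand-ins for embeddings from a frozen multimodal encoder (e.g., CLIP-like).
    concept_texts = torch.randn(32, 512)   # 32 concepts, 512-d text embeddings
    images = torch.randn(4, 512)           # batch of 4 image embeddings
    head = ConceptBottleneckHead(concept_texts, num_classes=10)
    print(head(images).shape)              # torch.Size([4, 10])
```

In a continual setting, new classes or concepts would correspond to adding rows to the class head or concept embedding matrix; how $MC^2$ handles this without forgetting is the paper's contribution and is not reproduced in this sketch.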
Primary Area: visualization or interpretation of learned representations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5397