Confounder-Free Continual Learning via Recursive Feature Normalization

27 Sept 2024 (modified: 26 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: deep neural networks, confounders, continual learning, invariant representations, statistical regression
TL;DR: We introduce the Recursive Metadata Normalization (R-MDN) layer to learn confounder-invariant feature representations under changing distributions of the data during continual learning.
Abstract: Confounders are extraneous variables that affect both the input and the target, resulting in spurious correlations and biased predictions. Learning feature representations that are invariant to confounders remains a significant challenge in continual learning. To remove the influence of confounding variables from intermediate feature representations, we introduce the Recursive Metadata Normalization (R-MDN) layer, which can be integrated at any stage within deep neural networks (DNNs). R-MDN performs statistical regression via the recursive least squares algorithm to maintain and continually update an internal model state with respect to changing distributions of data and confounding variables. Since R-MDN operates on the level of individual examples, it is compatible with state-of-the-art architectures like vision transformers. Our experiments demonstrate that R-MDN promotes equitable predictions across population groups, both within static learning and across different stages of continual learning, by reducing catastrophic forgetting caused by confounder effects that change over time.
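The core mechanism the abstract describes, residualizing features against confounders with recursive least squares so the regression state can be updated one example at a time, can be sketched as follows. This is a minimal illustration under assumed names (`RecursiveResidualizer`, a scalar feature, a forgetting factor), not the paper's actual R-MDN layer:

```python
import numpy as np

class RecursiveResidualizer:
    """Sketch: remove a linear confounder effect from a streaming feature
    via recursive least squares (RLS). Illustrative only; the real R-MDN
    layer operates on full DNN feature maps inside the network."""

    def __init__(self, n_confounders, forgetting=1.0, delta=1e3):
        self.beta = np.zeros(n_confounders)     # regression coefficients
        self.P = np.eye(n_confounders) * delta  # inverse-covariance estimate
        self.lam = forgetting                   # forgetting factor in (0, 1]

    def update(self, c, f):
        """c: confounder vector for one example, f: scalar feature.
        Updates the internal state and returns the residualized feature."""
        Pc = self.P @ c
        k = Pc / (self.lam + c @ Pc)            # RLS gain
        err = f - c @ self.beta                 # prediction error
        self.beta = self.beta + k * err         # coefficient update
        self.P = (self.P - np.outer(k, Pc)) / self.lam
        return f - c @ self.beta                # confounder-removed feature
```

A forgetting factor below 1.0 down-weights old examples, which is what lets the internal state track confounder distributions that drift across continual-learning stages; with `forgetting=1.0` the recursion converges to the ordinary least-squares fit over all examples seen so far.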
Supplementary Material: zip
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11025