Compositional Risk Minimization

Published: 10 Oct 2024 · Last Modified: 25 Dec 2024 · NeurIPS'24 Compositional Learning Workshop Poster · CC BY 4.0
Keywords: Compositional Shifts, Compositional Generalization, Out of Distribution Generalization
TL;DR: Provable method for extrapolating classifiers to novel combinations of attributes (a.k.a. compositional generalization)
Abstract: In this work, we tackle a challenging and extreme form of subpopulation shift, termed compositional shift. Under compositional shifts, some combinations of attributes are entirely absent from the training distribution but present in the test distribution. We model the data with flexible additive energy distributions, where each energy term represents an attribute, and derive a simple alternative to empirical risk minimization termed compositional risk minimization (CRM). We provide an extensive theoretical analysis of CRM, showing that our proposal extrapolates to special affine hulls of seen attribute combinations. Empirical evaluations on benchmark datasets confirm the improved robustness of CRM compared to other popular methods designed to tackle various forms of subpopulation shifts.
Submission Number: 28
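To make the additive energy idea concrete, here is a minimal, hypothetical sketch of an energy model over two discrete attributes whose total energy decomposes as a sum of per-attribute terms, so that every attribute combination, including ones absent from training, can be scored at test time. This is not the authors' implementation of CRM; the class name `AdditiveEnergyModel`, the encoder architecture, and all dimensions are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the paper's code): an additive energy model
# over two hypothetical attributes. The energy of an attribute combination
# decomposes as E(x, a1, a2) = E1(x, a1) + E2(x, a2), so combinations never
# seen together during training can still be scored at test time.
import torch
import torch.nn as nn

class AdditiveEnergyModel(nn.Module):
    def __init__(self, feat_dim, n_vals_attr1, n_vals_attr2, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        # One energy head per attribute; the total energy is their sum.
        self.head1 = nn.Linear(hidden, n_vals_attr1)
        self.head2 = nn.Linear(hidden, n_vals_attr2)

    def energy(self, x):
        h = self.encoder(x)
        e1 = self.head1(h)                         # (batch, n_vals_attr1)
        e2 = self.head2(h)                         # (batch, n_vals_attr2)
        # Broadcast to get the energy of every (a1, a2) combination.
        return e1.unsqueeze(2) + e2.unsqueeze(1)   # (batch, n1, n2)

    def log_prob(self, x):
        e = self.energy(x)
        # p(a1, a2 | x) proportional to exp(-E(x, a1, a2)), normalized over
        # all combinations, seen or unseen.
        logits = -e.flatten(1)
        return logits.log_softmax(dim=1).view_as(e)

# Toy usage: score all 3 x 4 = 12 attribute combinations for a batch of inputs.
model = AdditiveEnergyModel(feat_dim=16, n_vals_attr1=3, n_vals_attr2=4)
x = torch.randn(8, 16)
print(model.log_prob(x).shape)  # torch.Size([8, 3, 4])
```

The key design point illustrated here is the additive decomposition: because each attribute contributes its own energy term, the model assigns a well-defined score to any combination of attribute values, which is what enables extrapolation beyond the combinations observed in training.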