Causal Balancing for Domain Generalization

Published: 21 Jul 2022, Last Modified: 22 Oct 2023, SCIS 2022 Poster
Keywords: domain generalization, causality, latent variable model, spurious correlation
TL;DR: We propose a balanced mini-batch sampling strategy to reduce the domain-specific spurious correlations in the observed training distributions for domain generalization.
Abstract: While machine learning models rapidly advance the state-of-the-art on various real-world tasks, out-of-domain (OOD) generalization remains a challenging problem given the vulnerability of these models to spurious correlations. We propose a causally-motivated balanced mini-batch sampling strategy for training robust classifiers that are minimax optimal across a sufficiently diverse environment space, utilizing multiple training sets from different environments. We provide an identifiability guarantee for the latent covariates in the proposed causal graph and show that, under an ideal scenario, our approach samples training data from a balanced, spurious-free distribution. Experiments on three domain generalization datasets demonstrate empirically that our balanced mini-batch sampling strategy improves the performance of four established domain generalization baselines relative to random mini-batch sampling.
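The abstract's core idea can be illustrated with a minimal sketch of balanced mini-batch sampling. The paper balances with respect to latent covariates identified under its causal graph; as a simplification, the sketch below balances over observed (environment, label) pairs, so the function name `balanced_batches` and its signature are assumptions for illustration, not the paper's implementation:

```python
import random
from collections import defaultdict

def balanced_batches(examples, batch_size, seed=0):
    """Yield mini-batches with equal counts per (environment, label) group.

    `examples` is a list of (x, y, env) tuples. Balancing over observed
    (env, y) pairs approximates drawing from a distribution in which the
    label is independent of the environment, weakening environment-specific
    spurious correlations. (The paper instead balances with respect to
    inferred latent covariates; this is a simplified proxy.)
    """
    rng = random.Random(seed)
    groups = defaultdict(list)
    for x, y, env in examples:
        groups[(env, y)].append((x, y, env))
    keys = sorted(groups)
    # Each group contributes equally to every batch.
    per_group = max(1, batch_size // len(keys))
    while True:
        batch = []
        for k in keys:
            batch.extend(rng.choices(groups[k], k=per_group))
        rng.shuffle(batch)
        yield batch
```

For a toy dataset with 2 labels and 3 environments, each batch of 12 contains exactly 2 examples from each of the 6 (env, label) groups, regardless of how skewed the group frequencies are in the raw training data.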