Leveraging Generative Foundation Models for Domain Generalization

Published: 03 Jul 2024, Last Modified: 15 Jul 2024
ICML 2024 FM-Wild Workshop Poster
License: CC BY 4.0
Keywords: Generative Foundation Models, Domain Generalization, Diffusion Models
TL;DR: Cross Domain Generative Augmentation
Abstract: There has been a substantial effort to tackle the Domain Generalization (DG) problem, much of it focused on developing new loss functions. Inspired by the capabilities of diffusion models, we pose a pivotal question: can diffusion models function as data augmentation tools that address DG from a data-centric perspective, rather than relying on loss functions? We show that a simple cross-domain generative augmentation (CDGA) scheme, combined with vanilla ERM and readily available diffusion models, outperforms state-of-the-art (SOTA) DG algorithms. To explain the success of CDGA, we show experimentally that CDGA reduces the distribution shift between domains, which is the main reason behind the lack of out-of-distribution (OOD) generalization of ERM under domain shift. These results advocate further investigation into the potential of SOTA generative models for tackling the representation learning problem.
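The augmentation scheme described in the abstract can be illustrated with a minimal sketch: every training sample is kept, and for each other domain a generated counterpart is added before training plain ERM on the pooled set. The `translate` function below is a hypothetical stand-in for a diffusion image-to-image model (an assumption, not the paper's implementation); it is stubbed so the control flow is runnable.

```python
def translate(x, target_domain):
    """Stub for a diffusion-based domain translation (assumed interface).

    A real implementation would call an image-to-image diffusion model
    to render sample `x` in the style of `target_domain`.
    """
    return (x, target_domain)


def cdga_augment(domain_data):
    """Pool originals with cross-domain generations.

    domain_data: dict mapping domain name -> list of (x, y) pairs.
    Returns a single list of (x, y) pairs for vanilla ERM training.
    """
    pooled = []
    domains = list(domain_data)
    for src in domains:
        for x, y in domain_data[src]:
            pooled.append((x, y))  # keep the original sample
            for tgt in domains:
                if tgt != src:  # generate toward every *other* domain
                    pooled.append((translate(x, tgt), y))
    return pooled


if __name__ == "__main__":
    data = {"photo": [("img0", 0)], "sketch": [("img1", 1)]}
    pooled = cdga_augment(data)
    print(len(pooled))  # 2 originals + 2 cross-domain generations
```

With D domains and N samples per domain, the pooled set has N * D * D entries, which is how the scheme shrinks the gap between domain distributions that the abstract identifies as the cause of ERM's OOD failure.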
Submission Number: 8