Beyond Boundaries: A Novel Data-Augmentation Discourse for Open Domain Generalization

Published: 04 Dec 2023, Last Modified: 04 Dec 2023. Accepted by TMLR.
Abstract: The problem of Open Domain Generalization (ODG) is multifaceted, encompassing shifts in both domains and labels across all source and target domains. Existing approaches face challenges such as style bias towards the training domains, insufficient feature-space disentanglement to highlight semantic features, and limited discriminativeness of the latent space. Additionally, they rely on confidence-based target outlier detection, which can lead to misclassifications when target open samples visually resemble source domain data. In response to these challenges, we present a solution named \textsc{ODG-Net}. We aim to build a direct open-set classifier within a \textit{discriminative}, \textit{unbiased}, and \textit{disentangled} semantic embedding space. To enrich data density and diversity, we introduce a generative augmentation framework that produces \textit{style-interpolated} novel domains for closed-set images, as well as novel pseudo-open images obtained by \textit{interpolating the contents of paired training images}. Our augmentation strategy exploits \textit{disentangled style and content information} to synthesize images effectively. Furthermore, we tackle style bias by representing every image with respect to the properties of all source domains, which accentuates complementary visual features. We then train a multi-class semantic object classifier covering both closed and open classes, together with a style classifier that identifies style primitives. The joint use of the style and semantic classifiers disentangles the latent space, thereby improving the generalization performance of the semantic classifier. To ensure discriminativeness in both the closed and open spaces, we optimize the semantic feature space using novel metric losses.
The experimental results on six benchmark datasets convincingly demonstrate that \textsc{ODG-Net} surpasses the state-of-the-art by an impressive margin of $1-4\%$ in both open and closed-set DG scenarios.
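As a rough illustration of the augmentation idea described in the abstract, the sketch below linearly interpolates disentangled style codes (to synthesize a novel-domain style for closed-set images) and content codes of paired images (to synthesize a pseudo-open sample). The function names, code dimensions, and the use of plain linear interpolation are illustrative assumptions, not the paper's actual generative framework:

```python
import numpy as np

def interpolate_styles(style_a, style_b, alpha):
    """Mix two style codes to simulate a novel, style-interpolated domain.

    (Illustrative stand-in for the paper's generative style augmentation.)
    """
    return alpha * style_a + (1.0 - alpha) * style_b

def interpolate_contents(content_a, content_b, beta):
    """Blend the content codes of a paired image to form a pseudo-open sample.

    (Illustrative stand-in for the paper's pseudo-open image synthesis.)
    """
    return beta * content_a + (1.0 - beta) * content_b

# Toy style/content codes; in practice these would come from a
# disentangling encoder trained on the source domains.
rng = np.random.default_rng(0)
style_a, style_b = rng.normal(size=64), rng.normal(size=64)
content_a, content_b = rng.normal(size=128), rng.normal(size=128)

novel_style = interpolate_styles(style_a, style_b, alpha=0.5)
pseudo_open = interpolate_contents(content_a, content_b, beta=0.5)
```

A decoder conditioned on (`content`, `novel_style`) would then render the augmented image; the pseudo-open samples give the direct open-set classifier explicit training signal for the open class.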
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We have modified the revised manuscript as follows: 1. Added more space around Fig. 2 and refined it with larger margins. 2. Enlarged Figure 4 for better viewing. 3. Split Tables 1 and 2 and enlarged them. 4. Enlarged Tables 1-8 for better readability.
Supplementary Material: pdf
Assigned Action Editor: ~Yannis_Kalantidis2
Submission Number: 1460