Adaboost-Based Local-Forest Adversarial Learning for Imbalanced Domain Adaptation

18 Sept 2025 (modified: 12 Feb 2026) · ICLR 2026 Conference Desk Rejected Submission · CC BY 4.0
Keywords: Imbalanced domain adaptation, AdaBoost, adversarial learning, sample interpolation
TL;DR: We propose a framework that integrates matching-driven adaptive sampling, local discriminator ensembles, and interpolation-based augmentation to address imbalanced domain adaptation.
Abstract: Class imbalance poses a significant challenge in unsupervised domain adaptation (UDA). We propose AdaBoost-based Local-Forest Adversarial Learning (ALFAL), a framework that leverages sample-wise label matching rates to guide both adaptive sampling and interpolation-based generation. ALFAL first samples informative instances and constructs a local discriminative forest (LDF) via clustering, enabling fine-grained regional alignment alongside the global discriminator within a domain adversarial learning framework. To further enhance adaptation for minority classes, a Boosted Pairwise Interpolation Generator (BPIG) synthesizes interpolated samples between high-weight source instances and confident target instances. These auxiliary samples are optimized through an adversarial exploration mechanism that probes challenging regions. Experiments demonstrate that ALFAL consistently outperforms existing state-of-the-art methods on imbalanced domain adaptation benchmarks.
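The core idea behind the BPIG component, as the abstract describes it, is to mix high-weight source samples with confident target samples. A minimal sketch of such pairwise interpolation is shown below; all names, thresholds, and the Beta-distributed mixing coefficient are illustrative assumptions, not details from the paper.

```python
import numpy as np

def boosted_pairwise_interpolation(x_src, w_src, x_tgt, conf_tgt,
                                   n_pairs=4, conf_thresh=0.9,
                                   alpha=0.5, seed=None):
    """Hypothetical sketch: synthesize samples by interpolating
    boosting-weighted source instances with confident target instances.
    (Function name and parameters are assumptions for illustration.)"""
    rng = np.random.default_rng(seed)
    # Pick source samples in proportion to their boosting weights.
    p = w_src / w_src.sum()
    src_idx = rng.choice(len(x_src), size=n_pairs, p=p)
    # Keep only target samples whose predicted confidence is high.
    keep = np.flatnonzero(conf_tgt >= conf_thresh)
    tgt_idx = rng.choice(keep, size=n_pairs)
    # Convex combination of each chosen source/target pair.
    lam = rng.beta(alpha, alpha, size=(n_pairs, 1))
    return lam * x_src[src_idx] + (1.0 - lam) * x_tgt[tgt_idx]
```

In the paper's full method these synthesized points would then be refined adversarially; the sketch only covers the sampling-and-interpolation step.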
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 11694