Keywords: Noisy Label, Domain Adaptation
Abstract: Unsupervised Domain Adaptation (UDA) becomes especially challenging when source labels are noisy, a situation common in real-world pipelines involving crowdsourcing or automated annotation. Label noise and domain shift jointly cause reliability issues: corrupted labels mislead supervision at the sample level, while ambiguous predictions hinder class-level alignment. Existing methods often address these issues in isolation with static heuristics, leading to fragile adaptation under severe noise. We introduce the Reliability Scheduling Framework (RSF), which unifies noisy-label learning and domain adaptation through multi-scale reliability scheduling. At the sample level, Confidence-Modulated Adaptive Learning (CMAL) dynamically adjusts gradients using an entropy-guided exponent, suppressing noise memorization while retaining strong signals from reliable samples. At the class level, Entropy-Guided Confusion Alignment (EGCA) reweights alignment based on prediction entropy, reducing inter-class confusion and sharpening decision boundaries. Together, CMAL and EGCA coordinate how much to learn and what to align, yielding robust transfer even under heavy label corruption. Extensive experiments on Office-31, Office-Home, and VisDA demonstrate that RSF consistently outperforms prior state-of-the-art methods across symmetric and asymmetric noise settings. These results establish RSF as a principled and effective solution for robust UDA with noisy supervision.
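The abstract does not give the exact form of the entropy-guided weighting used by CMAL and EGCA, but the core idea of reliability scheduling from prediction entropy can be sketched. The following is a minimal illustration under assumed formulas: the function name `entropy_weight`, the normalization by log(C), and the exponent `gamma` are all hypothetical choices, not the paper's actual method.

```python
import numpy as np

def entropy_weight(probs, gamma=2.0, eps=1e-12):
    """Hypothetical per-sample reliability weight from prediction entropy.

    Confident (low-entropy) predictions get weights near 1, so their
    gradients are retained; ambiguous (high-entropy) predictions are
    suppressed, limiting memorization of noisy labels.
    """
    num_classes = probs.shape[-1]
    # Shannon entropy of each predicted distribution
    H = -np.sum(probs * np.log(probs + eps), axis=-1)
    # Normalize to [0, 1] by the maximum entropy log(C)
    H_norm = H / np.log(num_classes)
    # Entropy-guided exponent: shrink unreliable samples' influence
    return (1.0 - H_norm) ** gamma

# A confident prediction is weighted heavily; an ambiguous one is not.
confident = np.array([[0.95, 0.03, 0.02]])
ambiguous = np.array([[0.40, 0.30, 0.30]])
print(entropy_weight(confident))  # close to 1
print(entropy_weight(ambiguous))  # close to 0
```

In a training loop, such a weight could multiply the per-sample cross-entropy loss (sample level, as in CMAL) or rescale per-class alignment terms (class level, as in EGCA); the paper's actual scheduling may differ.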
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 7923