Structured Robustness for Distribution Shifts

Published: 06 Mar 2025, Last Modified: 01 Apr 2025 · SCSL @ ICLR 2025 · CC BY 4.0
Track: regular paper (up to 6 pages)
Keywords: Distributionally Robust Optimization, Transformation-Invariance, Mirror Descent
TL;DR: We propose a min-max framework, modeling label-preserving transformations and adversarial reweighting within an f-divergence ball, to achieve robust out-of-distribution generalization with theoretical guarantees and empirical validation.
Abstract: Out-of-distribution (OOD) data often undermines reliable model deployment in high-stakes domains such as financial markets, where overlooked correlations and unexpected shifts can render predictive systems ineffective. We propose STAR (Structured Transformations and Adversarial Reweighting), a framework that leverages the geometry of distribution shifts by combining transformation-based invariances with divergence-based robust optimization. Specifically, STAR places an f-divergence ball around each label-preserving transformation of the training sample, empowering an adversary to apply known transformations and reweight the resulting data within a specified divergence radius. This design captures both large, structured shifts and subtle, unmodeled perturbations — a critical step toward mitigating shortcuts and spurious correlations. Notably, STAR recovers standard distributionally robust optimization if no structured transformations are assumed. We establish a uniform-convergence analysis showing that minimizing STAR's empirical nested min-max objective achieves low worst-case error over all admissible shifts with high probability. Our results quantify the additional samples needed to handle the adversary's flexibility, providing theoretical guidance for selecting the divergence radius based on problem complexity. Empirical studies on synthetic and image benchmarks confirm that STAR outperforms baselines, consistent with our theoretical findings.
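The nested min-max structure described above can be illustrated numerically. The following is a minimal sketch, not the paper's implementation: it assumes the KL divergence as the f-divergence (so the inner adversarial reweighting admits the standard dual form min over lambda of lambda * rho + lambda * log E[exp(loss / lambda)]), uses a simple grid search for the dual variable, and the function names (`kl_worst_case`, `star_objective`) and the logistic-loss setup are illustrative choices.

```python
import numpy as np

def kl_worst_case(losses, rho, lambdas=np.logspace(-2, 2, 200)):
    """Worst-case expected loss over a KL ball of radius rho around the
    empirical distribution of per-sample losses, via the dual form
        min_{lam > 0}  lam * rho + lam * log( mean( exp(loss / lam) ) ).
    The dual is minimized over a fixed grid of lam values for simplicity."""
    best = np.inf
    for lam in lambdas:
        z = losses / lam
        # numerically stable log-mean-exp
        lse = np.log(np.mean(np.exp(z - z.max()))) + z.max()
        best = min(best, lam * rho + lam * lse)
    return best

def star_objective(theta, X, y, transforms, rho):
    """Nested objective: apply each label-preserving transform, surround the
    transformed sample with a KL ball of radius rho, and take the worst case
    over both the transform and the adversarial reweighting."""
    worst = -np.inf
    for T in transforms:
        margins = y * (T(X) @ theta)          # labels y in {-1, +1}
        losses = np.logaddexp(0.0, -margins)  # logistic loss log(1 + e^{-m})
        worst = max(worst, kl_worst_case(losses, rho))
    return worst
```

Two sanity properties connect this sketch to the abstract's claims: with `rho = 0` the inner supremum collapses to the ordinary mean loss, and with only the identity transform the objective reduces to standard (KL-ball) distributionally robust optimization, matching the statement that STAR recovers standard DRO when no structured transformations are assumed.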
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Erfan_Darzi1
Format: No, the presenting author is unable to, or unlikely to be able to, attend in person.
Funding: No, the presenting author of this submission does *not* fall under ICLR’s funding aims, or has sufficient alternate funding.
Submission Number: 56