Group Robustness via Adaptive Class-Specific Scaling

24 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: group robustness, debiasing
TL;DR: Class-specific scaling can effectively identify and control the inherent trade-off in existing debiasing methods.
Abstract: Group distributionally robust optimization, which aims to improve robust accuracies such as worst-group or unbiased accuracy, is one of the mainstream approaches for mitigating spurious correlations and handling dataset bias. Existing approaches apparently improve robust accuracy, but in fact these performance gains mainly come from trade-offs at the expense of average accuracy. To control this trade-off flexibly and efficiently, we first propose a simple class-specific scaling strategy that is directly applicable to existing debiasing algorithms without additional training. We also develop an instance-wise adaptive scaling technique to overcome the trade-off and further improve performance in terms of both accuracies. Our approach reveals that a naïve ERM baseline matches or even outperforms recent debiasing methods once the class-specific scaling technique is adopted. We then employ this technique to evaluate existing algorithms comprehensively, introducing a novel unified metric that summarizes the trade-off between the two accuracies as a scalar value. By accounting for the inherent trade-off and providing a performance landscape, our approach delivers meaningful insights into existing robust methods beyond robust accuracy alone. We perform experiments on datasets in the computer vision and natural language processing domains and verify the effectiveness of the proposed frameworks.
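As a rough illustration of the class-specific scaling idea described in the abstract, the sketch below applies a per-class scaling vector to the logits of an already-trained model (no additional training) and sweeps a single factor to trace out the average/worst-group accuracy trade-off on a held-out set. This is a minimal sketch under assumed conventions: the two-class setting, the synthetic logits, and the helper names (`scaled_predict`, `group_accuracies`) are illustrative assumptions, not the paper's released implementation.

```python
import numpy as np

def scaled_predict(logits, scale):
    """Predict after applying a per-class scaling vector to the logits.

    logits: (N, C) outputs of any trained (ERM or debiased) model.
    scale:  (C,) positive per-class factors; rescaling shifts the decision
            rule toward or away from each class without retraining.
    """
    return np.argmax(logits * scale, axis=1)

def group_accuracies(preds, labels, groups):
    """Return (average accuracy, worst-group accuracy)."""
    correct = (preds == labels)
    avg_acc = correct.mean()
    worst_acc = min(correct[groups == g].mean() for g in np.unique(groups))
    return avg_acc, worst_acc

# Hypothetical usage with synthetic data: sweep a single scaling factor
# for class 1 and observe how average and worst-group accuracy trade off.
rng = np.random.default_rng(0)
N, C = 1000, 2
logits = rng.normal(size=(N, C))        # stand-in for real model outputs
labels = rng.integers(0, C, size=N)
groups = rng.integers(0, 4, size=N)     # e.g., (class, spurious attribute) pairs

for s in np.linspace(0.5, 2.0, 7):
    scale = np.array([1.0, s])          # scale only class 1
    preds = scaled_predict(logits, scale)
    avg, worst = group_accuracies(preds, labels, groups)
    print(f"s={s:.2f}  avg={avg:.3f}  worst-group={worst:.3f}")
```

In practice the scaling vector would be selected on a validation set, and sweeping it yields the kind of performance landscape over (average, worst-group) accuracy that the abstract's unified metric summarizes as a scalar.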
Primary Area: societal considerations including fairness, safety, privacy
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8822