A Fundamental Accuracy--Robustness Trade-off in Regression and Classification

TMLR Paper3620 Authors

03 Nov 2024 (modified: 11 Mar 2025) · Rejected by TMLR · CC BY 4.0
Abstract: We derive a fundamental trade-off between standard and adversarial risk in a rather general situation that formalizes the following simple intuition: "If no (nearly) optimal predictor is smooth, adversarial robustness comes at the cost of accuracy." As a concrete example, we evaluate the derived trade-off in regression with polynomial ridge functions under mild regularity conditions.
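The intuition in the abstract can be illustrated numerically. The sketch below is not the paper's construction; it is a minimal simulation, with illustrative names (`w_std`, `w_rob`) and an arbitrary perturbation budget `eps`, of the $\ell_2$ linear-regression setting the submission discusses: on data generated by a (partly) polynomial ridge function, a least-squares predictor is compared with one trained against worst-case $\ell_2$ input perturbations, whose adversarial squared loss has the closed form $(|w^\top x - y| + \varepsilon\|w\|_2)^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a ridge function with a polynomial (p = 2) component,
# so the best linear predictor is a genuine compromise.
n, d = 500, 5
theta = rng.normal(size=d)
theta /= np.linalg.norm(theta)
X = rng.normal(size=(n, d))
y = X @ theta + 0.5 * (X @ theta) ** 2

eps = 0.5  # l2 perturbation budget (illustrative choice, not from the paper)

def std_risk(w):
    # Empirical standard risk: mean squared error on clean inputs.
    return np.mean((X @ w - y) ** 2)

def adv_risk(w):
    # Closed form of sup_{||delta||_2 <= eps} (w . (x + delta) - y)^2
    # for a linear predictor.
    return np.mean((np.abs(X @ w - y) + eps * np.linalg.norm(w)) ** 2)

# Standard predictor: ordinary least squares.
w_std, *_ = np.linalg.lstsq(X, y, rcond=None)

# Robust predictor: gradient descent on the (convex) empirical adversarial
# risk, initialized at the least-squares solution.
w = w_std.copy()
lr = 0.05
for _ in range(2000):
    r = X @ w - y
    nw = np.linalg.norm(w) + 1e-12
    m = np.abs(r) + eps * nw
    grad = 2 * (X.T @ (m * np.sign(r)) / n + eps * np.mean(m) * w / nw)
    w -= lr * grad
w_rob = w

print(f"standard risk:    OLS {std_risk(w_std):.3f}  robust {std_risk(w_rob):.3f}")
print(f"adversarial risk: OLS {adv_risk(w_std):.3f}  robust {adv_risk(w_rob):.3f}")
```

With these settings the robust predictor shrinks toward zero: its standard risk is (necessarily) no better than the least-squares optimum, while its adversarial risk improves, exhibiting the trade-off direction the abstract describes.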
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: The major changes to the manuscript are as follows:
- A paper on adversarial linear regression that one of the reviewers mentioned is now discussed in the related work.
- Two subsections are added at the end of the introduction to elaborate on the contributions and remarks, and on future directions.
- The statement of the theorem is revised to include a slightly more general form alongside the original simplified version.
- The latter part of Section 3 is expanded to discuss $\ell_2$ perturbations in linear regression, as well as the effect of the non-linearity as parameterized by the parameter $p$.
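For context on the $\ell_2$-perturbation case mentioned in the revision notes: for a linear predictor the adversarial squared loss admits a simple closed form (a standard computation, sketched here with our own notation, not necessarily the paper's):

$$
\sup_{\|\delta\|_2 \le \varepsilon} \big(w^\top (x+\delta) - y\big)^2
= \big(|w^\top x - y| + \varepsilon \, \|w\|_2\big)^2,
$$

since $w^\top \delta$ ranges over $[-\varepsilon\|w\|_2,\, \varepsilon\|w\|_2]$ by Cauchy–Schwarz, and the supremum is attained at $\delta = \pm\varepsilon\, w/\|w\|_2$ aligned with the sign of the residual $w^\top x - y$.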
Assigned Action Editor: ~Han_Bao2
Submission Number: 3620