Abstract: We derive a fundamental trade-off between standard and adversarial risk in a rather general situation that formalizes the following simple intuition:
``If no (nearly) optimal predictor is smooth, adversarial robustness comes at the cost of accuracy.''
As a concrete example, we evaluate the derived trade-off in regression with polynomial ridge functions under mild regularity conditions.
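The intuition in the abstract can be illustrated numerically. The sketch below (our own construction, not taken from the paper) uses a polynomial ridge function target $y = (w \cdot x)^p$ and compares the standard squared-error risk of the optimal predictor with its adversarial risk under $\ell_2$ perturbations of radius $\varepsilon$; because the predictor is a ridge function, the worst-case perturbation acts along $\pm w$, so a one-dimensional grid search over the induced shift suffices. All names, parameter values, and the grid-search approximation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (ours, not the paper's): target is a polynomial
# ridge function y = (w . x)^p in dimension d with degree p.
d, p, n, eps = 5, 3, 2000, 0.1
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
X = rng.standard_normal((n, d))
y = (X @ w) ** p

# Use the optimal predictor itself: standard risk is exactly zero.
std_risk = np.mean(((X @ w) ** p - y) ** 2)

# Adversarial risk under an l2 ball of radius eps: for a ridge
# function the worst-case delta lies along +/- w, so a 1-d grid
# search over the shift t = w . delta in [-eps, eps] suffices
# (a numerical sketch, not the paper's derivation).
t = np.linspace(-eps, eps, 201)  # grid includes t = 0
losses = (((X @ w)[:, None] + t[None, :]) ** p - y[:, None]) ** 2
adv_risk = np.mean(losses.max(axis=1))

print(f"standard risk:    {std_risk:.4f}")
print(f"adversarial risk: {adv_risk:.4f}")
assert adv_risk >= std_risk  # the robustness gap is nonnegative
```

Even though the predictor is optimal (zero standard risk), its adversarial risk is strictly positive, and the gap grows with the non-linearity degree $p$, matching the abstract's intuition that non-smooth optimal predictors force a robustness-accuracy trade-off.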
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: The major changes to the manuscript are as follows:
- A paper on adversarial linear regression that one of the reviewers mentioned is now discussed in the related work
- Two subsections are added at the end of the introduction to elaborate on the contributions and remarks, and on future directions
- The statement of the main theorem is revised to include a slightly more general form alongside the original simplified version
- The latter part of Section 3 is expanded to discuss $\ell_2$ perturbations in linear regression, as well as the effect of the non-linearity parameterized by $p$
Assigned Action Editor: ~Han_Bao2
Submission Number: 3620