IMPROVING ADVERSARIAL TRAINING WITH MARGIN- WEIGHTED PERTURBATION BUDGET

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Adversarial Training, Adversarial Robustness, Adversarial Examples, Trustworthy Machine Learning
Abstract: Adversarial Training (AT) effectively improves the robustness of Deep Neural Networks (DNNs) to adversarial attacks. Typically, AT trains DNN models on adversarial examples crafted within a pre-defined, fixed perturbation bound. However, the natural examples from which these adversarial examples are crafted exhibit varying degrees of intrinsic vulnerability, so crafting adversarial examples with the same fixed perturbation radius for all instances may not fully exploit the potential of adversarial training. Motivated by this observation, we propose a simple, computationally cheap reweighting function that assigns perturbation bounds to the adversarial examples used for Adversarial Training. We name our approach \textit{Margin-Weighted Perturbation Budget (MWPB)}. The proposed method assigns a perturbation radius to each adversarial example based on the vulnerability of its corresponding natural example. Experimental results show that the proposed method yields genuine improvements in the robustness of existing AT algorithms against various adversarial attacks.
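The abstract does not give the exact MWPB weighting formula, but the core idea of deriving per-example perturbation radii from instance vulnerability can be sketched as follows. This is a minimal illustration under assumed choices: the logit margin of the natural example as the vulnerability measure, a linear mapping from the batch-normalized margin to a radius in [eps_min, eps_max], and the hypothetical helper name margin_weighted_eps. It is not the authors' published formulation.

```python
import torch


def margin_weighted_eps(model, x, y, eps_min=2/255, eps_max=16/255):
    """Illustrative per-example perturbation budgets from logit margins.

    NOTE: the mapping used here (larger margin -> larger budget, linear in the
    batch-normalized margin) is an assumed placeholder, not the MWPB formula
    from the paper.
    """
    with torch.no_grad():
        logits = model(x)                                   # (B, num_classes)
        true_logit = logits.gather(1, y.unsqueeze(1)).squeeze(1)
        # Largest logit among the incorrect classes.
        masked = logits.clone()
        masked.scatter_(1, y.unsqueeze(1), float('-inf'))
        runner_up = masked.max(dim=1).values
        margin = true_logit - runner_up                     # > 0 iff correctly classified
        # Normalize margins to [0, 1] within the batch, then map to a radius.
        w = (margin - margin.min()) / (margin.max() - margin.min() + 1e-12)
        eps = eps_min + w * (eps_max - eps_min)             # per-example radii, shape (B,)
    return eps
```

In such a sketch, the returned per-example radii would replace the single fixed radius when generating adversarial examples (e.g., as the projection bound inside a PGD loop) during adversarial training.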
Primary Area: societal considerations including fairness, safety, privacy
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7824