Curvature Dynamic Black-box Attack: revisiting adversarial robustness via dynamic curvature estimation

Published: 23 Sept 2025, Last Modified: 27 Nov 2025, NeurReps 2025 Proceedings, CC BY 4.0
Keywords: adversarial attacks, adversarial robustness, decision boundary curvature
TL;DR: We propose a curvature-based black-box attack to estimate decision boundary curvature and establish a link between curvature and adversarial robustness.
Abstract: Adversarial attacks reveal the vulnerability of deep learning models. It is commonly assumed that high curvature gives rise to a rough decision boundary and thus to less robust models. However, the \textit{curvature} most often studied is that of the loss function, the class scores, or other quantities internal to the model, rather than the curvature of the decision boundary itself, since the former can be formed relatively easily from second-order derivatives. In this paper, we propose a query-efficient method, dynamic curvature estimation (DCE), to estimate decision boundary curvature in a black-box setting. Our approach builds on CGBA, a black-box adversarial attack. By performing DCE on a wide range of classifiers, we establish a statistical connection between decision boundary curvature and adversarial robustness. We also propose a new attack, the curvature dynamic black-box attack (CDBA), which uses the estimated curvature to improve attack performance.
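To make the contrast concrete, the two notions of curvature can be written as follows; the notation (logits $f_c$, loss $\mathcal{L}$, label $y$) is illustrative and not taken from the paper. The usual white-box proxy is the second-order (Hessian) curvature of the loss, whereas the geometric quantity targeted here is the curvature of the decision boundary, viewed as the level set on which two class scores coincide (its mean curvature, up to normalization, is the divergence of the unit normal):
\[
\kappa_{\text{loss}}(x) = \big\lVert \nabla_x^2\, \mathcal{L}\big(f(x), y\big) \big\rVert_2,
\qquad
\mathcal{B}_{c,c'} = \{\, x : g(x) = 0 \,\},\quad g(x) = f_c(x) - f_{c'}(x),
\qquad
\kappa_{\text{bd}}(x) = \nabla \cdot \frac{\nabla g(x)}{\lVert \nabla g(x) \rVert}.
\]
In the black-box setting neither $\nabla_x \mathcal{L}$ nor $\nabla g$ is available, which is why the boundary curvature must instead be estimated from queried boundary points, as DCE does.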
Poster Pdf: pdf
Submission Number: 29