Zeroth-Order Stochastic Compositional Gradient Descent: Towards Black-Box Sparse AUC Maximization
Abstract: The area under the ROC curve (AUC) is a key metric for classification tasks, valued for its robustness to class imbalance. Sparse models trained with $\ell_0$ constraints further enhance interpretability and generalization. Building on prior work that reformulates nonlinear AUC maximization as a pointwise compositional optimization problem, we revisit this formulation as the basis for addressing the black-box setting, where only function evaluations are available. A central, previously unresolved challenge lies in integrating zeroth-order gradient estimation with hard-thresholding operators within the compositional framework. To overcome this difficulty, we propose the Zeroth-Order Stochastic Compositional Hard-Thresholding (ZO-SCHT) algorithm, which, to the best of our knowledge, is the first method for black-box sparse AUC maximization. We establish that ZO-SCHT achieves linear convergence up to a tolerance bound under a fixed step size. Extensive experiments on both black-box sparse AUC maximization and black-box adversarial attack tasks demonstrate the effectiveness and versatility of our approach.
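The abstract names two ingredients that ZO-SCHT must combine: zeroth-order gradient estimation (since only function evaluations are available) and a hard-thresholding operator (to enforce the $\ell_0$ constraint). The following minimal sketch shows how these two pieces fit together in the simplest, non-compositional case. It pairs a standard two-point Gaussian-smoothing gradient estimator with a top-$k$ hard-thresholding projection; the function names (`zo_gradient`, `hard_threshold`, `zo_ht_descent`), step size, and direction count are illustrative assumptions, not the paper's ZO-SCHT algorithm, which additionally handles the compositional structure of the AUC objective.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, num_dirs=10, rng=None):
    """Two-point zeroth-order estimate of grad f(x), averaged over
    random Gaussian directions (a generic estimator, not the paper's)."""
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.size)
        # Central finite difference along direction u, scaled back onto u.
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / num_dirs

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def zo_ht_descent(f, x0, k, step=0.1, iters=200, rng=None):
    """Illustrative loop: x <- HT_k(x - step * zo_gradient(f, x))."""
    rng = np.random.default_rng() if rng is None else rng
    x = hard_threshold(np.asarray(x0, dtype=float), k)
    for _ in range(iters):
        x = hard_threshold(x - step * zo_gradient(f, x, rng=rng), k)
    return x
```

On a simple sparse least-squares objective, this loop recovers a $k$-sparse minimizer using only function values, illustrating the interaction that the paper analyzes: the thresholding step must tolerate the noise in the zeroth-order gradient estimate.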
Submission Number: 1436