Keywords: Diversity metric, Jacobian matrix, singular value decomposition, Ensemble adversarial robustness
TL;DR: Through an optimization-theoretic analysis under Lagrange conditions, the SVD of the Jacobian matrix characterizes the diversity of sub-models and is further used to improve ensemble robustness.
Abstract: The transferability of adversarial samples across different CNN models is not only a key metric for evaluating the performance of adversarial examples, but also an important research direction in defending against them. Diversified models prevent black-box attacks that rely on a specific surrogate model. Meanwhile, recent research has revealed that adversarial transferability across sub-models can be used to abstractly express the diversity requirements of sub-models for ensemble robustness. Because earlier studies offered no mathematical description of this diversity, differences in model architecture or model output were used as empirical criteria in evaluation, with the model loss as the optimization objective. This paper proposes corresponding assessment criteria and provides a more precise mathematical explanation of the transferability of adversarial samples between models, based on the singular value decomposition (SVD) of data-dependent Jacobians. Building on these criteria, a new constraint norm is proposed for model training that suppresses adversarial transferability without any prior knowledge of adversarial samples. Given the high-dimensional inputs encountered during training, extracting model attributes via dimensionality reduction of the Jacobians makes both the evaluation metric and the training norm more effective. Experiments show that the proposed metric is highly correlated with the actual transferability between sub-models, and that models trained with this constraint norm achieve improved ensemble adversarial robustness.
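To illustrate the core idea, the sketch below compares the dominant right singular subspaces of two sub-models' input Jacobians as a rough diversity proxy. This is a minimal, hypothetical example, not the paper's actual metric or constraint norm: it uses toy linear models (whose Jacobian is simply the weight matrix) and measures subspace alignment via the cosines of principal angles; the function names and the choice of k are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for two sub-models: for a linear map f(x) = W x,
# the data-dependent Jacobian is simply W (hypothetical example).
W1 = rng.standard_normal((10, 32))
W2 = rng.standard_normal((10, 32))

def top_right_singular_vectors(J, k):
    """Right singular vectors of the Jacobian span the input directions
    the model is most sensitive to (candidate adversarial directions)."""
    _, _, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt[:k].T  # shape: (input_dim, k), orthonormal columns

def subspace_alignment(J1, J2, k=3):
    """Mean squared cosine of the principal angles between the top-k
    sensitive subspaces of two Jacobians: 1 means identical subspaces,
    0 means orthogonal. Lower alignment would heuristically suggest
    lower adversarial transferability between the two sub-models."""
    V1 = top_right_singular_vectors(J1, k)
    V2 = top_right_singular_vectors(J2, k)
    # Singular values of V1^T V2 are the cosines of the principal angles.
    cosines = np.linalg.svd(V1.T @ V2, compute_uv=False)
    return float(np.mean(cosines ** 2))

print(subspace_alignment(W1, W1))  # identical model -> 1.0
print(subspace_alignment(W1, W2))  # independent random models -> well below 1
```

In practice the Jacobian of a deep network would be computed per input via automatic differentiation, and a differentiable version of such an alignment term could serve as a training penalty; the linear setup here only demonstrates the SVD-based comparison itself.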