Abstract: Deep Neural Networks (DNNs) are known to be vulnerable to adversarial attacks. Recently, Stochastic Neural Networks (SNNs) have been proposed to enhance adversarial robustness by injecting uncertainty into the models. However, existing SNNs are often designed based on intuition and rely on adversarial training, which is computationally costly. To address this issue, we propose a novel SNN called the Weight-based Stochastic Neural Network (WB-SNN), which optimizes an upper bound on the adversarial-robustness error from the perspective of the weight distribution. To the best of our knowledge, we are the first to propose a theoretically guaranteed weight-based stochastic neural network that does not rely on adversarial training. Compared with standard adversarial training, our method reduces the computational cost by about a factor of three. Extensive experiments on various datasets, networks, and adversarial attacks demonstrate the effectiveness of the proposed method.