Fast Adversarial Training with Dynamic Batch-level Attack Control

Published: 01 Jan 2023, Last Modified: 29 Oct 2023, DAC 2023
Abstract: Although adversarial training provides effective protection against adversarial attacks, it suffers from a large computational overhead. To mitigate this overhead, we propose DBAC, a fast adversarial training method with dynamic batch-level attack control. Building on a prior finding that attack strength should grow gradually throughout training, we control the number of samples attacked per batch to improve throughput. Additionally, we collect samples from multiple batches to form a pseudo-batch and attack them simultaneously for higher GPU utilization. We implement DBAC in PyTorch and show that it achieves superior throughput with robust accuracy comparable to the prior art.
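
The sketch below illustrates the two mechanisms the abstract describes: growing the number of attacked samples per batch over training, and pooling the to-be-attacked samples from several batches into one pseudo-batch so the attack runs as a single, better-utilized GPU workload. It is not the authors' code; the attack is a single-step FGSM stand-in, and names such as `attack_fraction`, `pseudo_batches`, and `fgsm_attack` are illustrative assumptions.

```python
# Minimal sketch of batch-level attack control with pseudo-batching.
# Assumptions (not from the paper): FGSM as the attack, a linear growth
# schedule for the attacked fraction, and pooling 4 batches per pseudo-batch.
import torch
import torch.nn.functional as F


def fgsm_attack(model, x, y, eps=8 / 255):
    """Single-step FGSM perturbation (a stand-in for the paper's attack)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad = torch.autograd.grad(loss, x_adv)[0]
    return (x_adv + eps * grad.sign()).clamp(0, 1).detach()


def attack_fraction(epoch, total_epochs):
    """Fraction of each batch to attack; grows linearly as training proceeds."""
    return min(1.0, (epoch + 1) / total_epochs)


def train_epoch(model, loader, optimizer, epoch, total_epochs,
                pseudo_batches=4, device="cuda"):
    model.train()
    buf_x, buf_y = [], []  # samples waiting to be attacked as one pseudo-batch

    def flush():
        # Attack the pooled samples in a single pass for higher GPU utilization.
        if not buf_x:
            return
        x, y = torch.cat(buf_x), torch.cat(buf_y)
        x_adv = fgsm_attack(model, x, y)
        loss = F.cross_entropy(model(x_adv), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        buf_x.clear()
        buf_y.clear()

    frac = attack_fraction(epoch, total_epochs)
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        k = int(frac * x.size(0))  # how many samples of this batch to attack
        if k > 0:
            buf_x.append(x[:k])
            buf_y.append(y[:k])

        # The remaining samples are trained on clean data right away.
        if k < x.size(0):
            loss = F.cross_entropy(model(x[k:]), y[k:])
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

        if len(buf_x) >= pseudo_batches:
            flush()
    flush()  # attack any leftover samples at the end of the epoch
```

With this structure, early epochs attack only a small slice of each batch (cheap), later epochs attack most of it, and the attacked slices are batched together so the GPU is never launched on tiny workloads.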