SGAT: Snapshot-guided adversarial training of neural networks

Published: 01 Jan 2023, Last Modified: 19 Jun 2024 · Neurocomputing 2023 · CC BY-SA 4.0
Abstract: Highlights
•We propose a new accumulated training method named SGAT, which uses ensembles of earlier model snapshots to boost subsequent training iterations via an adversarial learning strategy. Unlike traditional ensembles and multi-model distillation, SGAT incurs no increase in training or inference cost.
•SGAT is a general training framework that can be applied to existing neural networks without structural modification. We conduct extensive experiments on different network architectures and benchmark datasets, and our results show that SGAT significantly improves test accuracy compared with existing methods.
•We also provide an in-depth analysis of each component of the proposed method, showing how it works through ablation studies and visualizations.
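The first highlight describes reusing earlier model snapshots as an auxiliary signal for later training steps. As a rough illustration of the snapshot-guided idea only, and not the paper's actual method (the adversarial component is not reproduced here, and all names, data, and hyperparameters below are assumptions), the following hypothetical sketch trains a toy one-parameter model while adding a term that pulls the current prediction toward the average prediction of stored snapshots:

```python
# Hypothetical sketch of snapshot-guided training on a toy 1-D model.
# Toy data: targets follow y = 2*x (an assumed example, not from the paper).
data = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

def predict(w, x):
    return w * x

def train_sgat_sketch(steps=200, lr=0.05, snapshot_every=20, alpha=0.1):
    """Plain SGD on squared error, plus an auxiliary term that pulls the
    current prediction toward the snapshot-ensemble prediction."""
    w = 0.0
    snapshots = []
    for t in range(steps):
        x, y = data[t % len(data)]
        # Gradient of the squared error (y_hat - y)^2 with respect to w.
        grad = 2.0 * (predict(w, x) - y) * x
        if snapshots:
            # Average prediction of earlier snapshots acts as a teacher.
            ens = sum(predict(s, x) for s in snapshots) / len(snapshots)
            grad += alpha * 2.0 * (predict(w, x) - ens) * x
        w -= lr * grad
        if (t + 1) % snapshot_every == 0:
            snapshots.append(w)  # store an earlier model snapshot
    return w

w = train_sgat_sketch()
print(round(w, 2))  # converges near 2.0
```

Note the design point the highlights emphasize: the snapshots are free by-products of the single training run, so no extra models are trained and inference still uses only the final weights.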