Spatially Decomposed Hinge Adversarial Loss by Local Gradient Amplifier

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Abstract: Generative Adversarial Networks (GANs) have attracted wide attention and achieved great success in various research areas, but they still suffer from training instability. Recently, a hinge adversarial loss for GANs was proposed that incorporates SVM-style margins, where real and fake samples falling within the margins contribute to the loss calculation. In the generator training step, however, fake samples outside the margins are ignored even when they partially contain unrealistic local patterns. In this work, we propose the local gradient amplifier (LGA), which realizes a spatially decomposed hinge adversarial loss for improved generator training. The spatially decomposed hinge adversarial loss applies different margins to different spatial regions, asymmetrically extending the overall margin space toward all fake samples. Our proposed method is evaluated on several public benchmark datasets and compared with state-of-the-art methods, showing outstanding stability in GAN training.
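To make the margin-decomposition idea concrete, the sketch below contrasts the standard hinge discriminator loss on a pooled score with a hypothetical per-location variant applied to a spatial score map. This is only an illustration of the general idea under assumptions, not the authors' LGA implementation: the function names, the PatchGAN-style `(N, 1, H, W)` score map, and the fixed margin of 1 are all assumptions made for this example.

```python
import torch
import torch.nn.functional as F


def hinge_d_loss(real_scores, fake_scores):
    """Standard hinge discriminator loss on globally pooled scores.

    Samples whose scores already lie outside the unit margin produce zero
    loss and therefore zero gradient.
    """
    return F.relu(1.0 - real_scores).mean() + F.relu(1.0 + fake_scores).mean()


def spatial_hinge_d_loss(real_map, fake_map, margin=1.0):
    """Hypothetical spatially decomposed hinge loss (illustrative only).

    real_map, fake_map: (N, 1, H, W) score maps from a discriminator that
    keeps spatial resolution (e.g., PatchGAN-style). The hinge margin is
    enforced at every spatial location, so local regions of a fake sample
    that still fall inside the margin contribute gradients even when the
    pooled global score is already outside it.
    """
    loss_real = F.relu(margin - real_map).mean()
    loss_fake = F.relu(margin + fake_map).mean()
    return loss_real + loss_fake


def generator_loss(fake_map):
    """Non-saturating hinge-style generator loss averaged over locations."""
    return -fake_map.mean()


# Usage with random (batch, 1, H, W) score maps standing in for a
# spatially resolved discriminator output:
real_map = torch.randn(8, 1, 16, 16)
fake_map = torch.randn(8, 1, 16, 16)
d_loss = spatial_hinge_d_loss(real_map, fake_map)
g_loss = generator_loss(fake_map)
```

The design point being illustrated is that keeping the discriminator output spatial and applying the margin per location lets locally unrealistic patches keep contributing gradients; the paper's actual LGA mechanism and margin schedule should be taken from the reviewed version linked below.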
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=2WXTLeTY2N