Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis

Published: 12 Jan 2021, Last Modified: 22 Oct 2023 · ICLR 2021 Poster · Readers: Everyone
Keywords: deep learning, generative model, image synthesis, few-shot learning, generative adversarial network, self-supervised learning, unsupervised learning
Abstract: Training Generative Adversarial Networks (GANs) on high-fidelity images usually requires large-scale GPU clusters and a vast number of training images. In this paper, we study the few-shot image synthesis task for GANs with minimal computing cost. We propose a lightweight GAN structure that attains superior quality at 1024^2 resolution. Notably, the model converges from scratch with just a few hours of training on a single RTX 2080 GPU and performs consistently even with fewer than 100 training samples. Our work consists of two technical designs: a skip-layer channel-wise excitation module and a self-supervised discriminator trained as a feature encoder. On thirteen datasets covering a wide variety of image domains (the datasets and code are available at https://github.com/odegeasslbc/FastGAN-pytorch), we show that our model outperforms the state-of-the-art StyleGAN2 when data and computing budgets are limited.
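To illustrate the first of the two designs mentioned in the abstract, below is a minimal sketch of a skip-layer channel-wise excitation block: a low-resolution feature map is squeezed into per-channel gating weights that modulate a higher-resolution feature map, in the spirit of squeeze-and-excitation. The specific layer choices (pooling size, activations, kernel sizes) are assumptions for illustration; the actual module is in the linked FastGAN-pytorch repository.

```python
import torch
import torch.nn as nn


class SkipLayerExcitation(nn.Module):
    """Sketch of a skip-layer channel-wise excitation (SLE) block.

    A low-resolution feature map produces channel-wise gating weights
    that re-weight a higher-resolution feature map. Layer choices here
    are illustrative assumptions, not the authors' exact design.
    """

    def __init__(self, ch_low: int, ch_high: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(4),              # squeeze the low-res map to 4x4
            nn.Conv2d(ch_low, ch_high, 4, 1, 0),  # 4x4 conv collapses it to 1x1
            nn.LeakyReLU(0.1, inplace=True),
            nn.Conv2d(ch_high, ch_high, 1),       # per-channel gating weights
            nn.Sigmoid(),
        )

    def forward(self, x_low: torch.Tensor, x_high: torch.Tensor) -> torch.Tensor:
        # Channel-wise re-weighting of the high-resolution features.
        return x_high * self.gate(x_low)


if __name__ == "__main__":
    # Example: an 8x8 feature map gating a 128x128 feature map.
    sle = SkipLayerExcitation(ch_low=512, ch_high=64)
    low = torch.randn(1, 512, 8, 8)
    high = torch.randn(1, 64, 128, 128)
    print(sle(low, high).shape)  # torch.Size([1, 64, 128, 128])
```

The second design, the self-supervised discriminator, adds a small decoder on top of discriminator features and trains it with a reconstruction objective on real images; see the repository above for the full implementation.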
One-sentence Summary: A computationally efficient GAN for few-shot high-fidelity image synthesis (converges on a single GPU in a few hours of training, at 1024^2 resolution with fewer than 100 images).
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Code: [odegeasslbc/FastGAN-pytorch](https://github.com/odegeasslbc/FastGAN-pytorch) + [6 community implementations](https://paperswithcode.com/paper/?openreview=1Fqg133qRaI)
Data: [FFHQ](https://paperswithcode.com/dataset/ffhq), [Perceptual Similarity](https://paperswithcode.com/dataset/perceptual-similarity)
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/arxiv:2101.04775/code)
24 Replies
