Training GANs with Stronger Augmentations via Contrastive Discriminator

May 3, 2021

About

Recent works on Generative Adversarial Networks (GANs) are actively revisiting various data augmentation techniques as an effective way to prevent discriminator overfitting. It remains unclear, however, which augmentations actually improve GANs, and in particular, how to apply a wider range of augmentations during training. In this paper, we propose a novel way to address these questions by incorporating a recent contrastive representation learning scheme into the discriminator, coined ContraD. This "fusion" enables discriminators to work with much stronger augmentations without catastrophic forgetting, which can significantly improve GAN training. Even better, we observe that contrastive learning itself also benefits from GAN training, i.e., from keeping features discriminative between real and fake samples, suggesting a strong coherence between the two worlds: a good contrastive representation is also good for GAN discriminators, and vice versa. Our experimental results show that GANs with ContraD consistently improve FID scores compared to other recent techniques using data augmentations, while still maintaining highly discriminative features in the discriminator in terms of linear evaluation. Finally, as a byproduct, we show that our GANs trained in an unsupervised manner (without labels) can induce many conditional generative models via simple latent sampling, leveraging the learned features of ContraD.
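The contrastive objective the abstract refers to is the SimCLR-style NT-Xent loss, computed over two strongly augmented views of each image passed through the discriminator's representation network. A minimal numpy sketch of that loss is below; this is a standard formulation for illustration only, not the authors' implementation, and all names are ours.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: (N, d) embeddings of two augmented views of the same N samples,
    e.g. features from the discriminator backbone in a ContraD-like setup.
    Each sample's positive is its other view; all remaining 2N-2 embeddings
    in the batch act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = (z @ z.T) / temperature                      # scaled cosine similarity
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = z1.shape[0]
    # The positive for index i is i+n, and for index i+n it is i.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_denom = np.log(np.exp(sim).sum(axis=1))        # log-sum-exp over others
    loss = -(sim[np.arange(2 * n), pos] - log_denom)
    return loss.mean()
```

In a ContraD-style discriminator, a loss of this form on augmented real (and fake) views would be combined with the usual adversarial loss from a separate discriminator head; perfectly aligned views yield a low loss, while unrelated views yield a high one.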

About ICLR 2021

The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.
