Bridging the Gap Between Coulomb GAN and Gradient-regularized WGAN

NeurIPS 2022 · Dec 2, 2022

About

Generative adversarial networks (GANs) are essentially a min-max game between a generator and a discriminator. Coulomb GANs have a closely related formulation, in which the generator minimizes the potential difference between the real (negative) and fake (positive) charge densities, while the discriminator approximates a low-dimensional Plummer kernel centered on the samples. Motivated by the links between electrostatic potential theory and the Poisson partial differential equation (PDE), we consider the functional optimization underlying the Coulomb GAN and show that the associated discriminator is the optimum of a first-order gradient-regularized Wasserstein GAN (WGAN) cost. Subsequently, we show that, within the regularized WGAN setting, the optimal discriminator is the Green's function of the Poisson PDE, which corresponds to the Coulomb potential. As an alternative to training a discriminator in either WGAN or Coulomb GAN, we demonstrate on synthetic data that a closed-form implementation of the optimal discriminator leads to superior generator performance.
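
The central claim above is that the optimal discriminator has a closed form: the Coulomb potential induced by the real and fake samples under a Plummer kernel, i.e., the Green's-function solution of the Poisson PDE. The NumPy sketch below illustrates that idea on toy data. The specific kernel exponent, the softening parameter eps, the dimension parameter d, and the use of the mean potential at generated samples as a generator loss are illustrative assumptions, not necessarily the exact formulation used in the talk.

```python
import numpy as np

def plummer_kernel(x, a, eps=1e-2, d=3):
    """Plummer kernel k(x, a) = (||x - a||^2 + eps^2)^(-(d - 2) / 2).

    x: (n, dim) query points, a: (m, dim) charge locations.
    Returns an (n, m) matrix of kernel evaluations.
    """
    diff = x[:, None, :] - a[None, :, :]    # (n, m, dim) pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)    # (n, m) squared distances
    return (sq_dist + eps ** 2) ** (-(d - 2) / 2.0)

def closed_form_discriminator(x, real, fake, eps=1e-2, d=3):
    """Closed-form 'discriminator': the potential difference at points x,
    with fake samples as positive charges and real samples as negative
    charges (the sign convention stated in the abstract)."""
    phi_fake = plummer_kernel(x, fake, eps, d).mean(axis=1)
    phi_real = plummer_kernel(x, real, eps, d).mean(axis=1)
    return phi_fake - phi_real              # (n,) potential values

# Toy usage: the generator would be trained to lower the potential at its
# own samples, pushing them toward the real data (hypothetical loss).
rng = np.random.default_rng(0)
real = rng.normal(loc=2.0, scale=0.5, size=(256, 2))   # "real" 2-D samples
fake = rng.normal(loc=0.0, scale=1.0, size=(256, 2))   # "generated" samples
gen_loss = closed_form_discriminator(fake, real, fake).mean()
print("mean potential at generated samples:", gen_loss)
```

In such a scheme no discriminator network is trained; the potential is recomputed from the current mini-batches, and the generator is updated to reduce it at its own samples.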
