Self-Supervised Model Training and Selection for Disentangling GANs

Jul 12, 2020

About

Standard deep generative models have latent codes that can be arbitrarily rotated, and a specific coordinate has no meaning. For manipulation and exploration of the samples, we seek a disentangled latent code where each coordinate is associated with a distinct property of the target distribution. Recent advances have been dominated by Variational Autoencoder (VAE)-based methods, while training disentangled generative adversarial networks (GANs) remains challenging. To this end, we make two contributions: a novel approach for training disentangled GANs and a novel approach for selecting the best disentangled model. First, we propose a regularizer that achieves higher disentanglement scores than state-of-the-art VAE- and GAN-based approaches. This contrastive regularizer is inspired by a natural notion of disentanglement: latent traversal, i.e., generating images by varying one latent code while fixing the rest. We turn this intuition into a regularizer by adding a discriminator that detects how the latent codes are coupled together in paired examples. Next, a major weakness of existing disentanglement benchmarks is that all reported scores are based on hyperparameters tuned against predefined disentangled representations on synthetic datasets. This is neither fair nor realistic: performance can be improved arbitrarily with more hyperparameter tuning, and real datasets do not come with such supervision. We propose an unsupervised model selection scheme based on medoids. Numerical experiments confirm that models selected in this way improve upon state-of-the-art models selected with supervised hyperparameter tuning.
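
As a rough illustration of the contrastive regularizer described above, the following minimal PyTorch-style sketch samples pairs of latent codes that agree on exactly one randomly chosen coordinate and asks a pair classifier to recover that coordinate from the two generated images. The generator `G`, the pair classifier `H`, and the function name are hypothetical placeholders, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def contrastive_regularizer(G, H, batch_size, latent_dim, device="cpu"):
    """Sketch of a latent-traversal-based contrastive regularizer (hypothetical).

    Two latent codes share the value of one randomly chosen coordinate k and
    differ everywhere else; the pair classifier H must recover k from the two
    generated images. The resulting cross-entropy can be added to the
    generator loss, encouraging each coordinate to control a distinct factor.
    """
    z1 = torch.randn(batch_size, latent_dim, device=device)
    z2 = torch.randn(batch_size, latent_dim, device=device)

    # Index of the coordinate that is coupled (shared) within each pair.
    k = torch.randint(latent_dim, (batch_size,), device=device)

    # Copy coordinate k from z1 into z2 so the pair agrees on exactly that coordinate.
    rows = torch.arange(batch_size, device=device)
    z2[rows, k] = z1[rows, k]

    x1, x2 = G(z1), G(z2)              # paired generated images
    logits = H(x1, x2)                 # (batch_size, latent_dim) coordinate scores
    return F.cross_entropy(logits, k)  # low loss <=> the coupling is detectable
```

The medoid-based model selection step could be sketched in a similarly hedged way, assuming a hypothetical pairwise agreement matrix between trained models has already been computed (e.g., from how well one model's latent codes predict another's). No ground-truth factor labels are needed: the model with the highest average similarity to the rest is taken as the most reliably disentangled one.

```python
import numpy as np

def select_medoid_model(similarity):
    """Pick the medoid of a set of trained models (hypothetical sketch).

    `similarity[i, j]` is a pairwise disentanglement-agreement score between
    models i and j. Returns the index of the model with the highest average
    similarity to all other models, along with the per-model centrality scores.
    """
    sim = np.array(similarity, dtype=float)
    np.fill_diagonal(sim, 0.0)                       # ignore self-similarity
    centrality = sim.sum(axis=1) / (sim.shape[0] - 1)
    return int(np.argmax(centrality)), centrality
```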

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
