Generic and Privacy-free Synthetic Data Generation for Pretraining GANs

Dec 2, 2022

About

Transfer learning for GANs successfully improves low-shot generation performance. However, existing studies show that a model pretrained on a single benchmark dataset does not generalize to various target datasets. More importantly, the pretrained model can be vulnerable to copyright or privacy risks. To resolve both issues, we propose an effective and unbiased data synthesizer, namely Primitives-PS, inspired by the generic characteristics of natural images. Since Primitives-PS only considers the generic properties of natural images, the generated images are free from copyright and privacy issues. In addition, a single model pretrained on our dataset can be transferred to various target datasets. Extensive analysis demonstrates that each component of our data synthesizer is effective, and provides insights into the desirable properties of a pretrained model for GAN transferability.
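The abstract does not spell out the Primitives-PS procedure itself, so the following is only a rough, hypothetical sketch of the general idea it alludes to: compositing simple geometric primitives into synthetic images that carry no copyright or privacy concerns and can serve as GAN pretraining data. The function name, shape choices, and parameters below are illustrative assumptions, not the authors' implementation.

```python
import random
from PIL import Image, ImageDraw

def synth_primitive_image(size=256, n_shapes=8, seed=None):
    """Compose random geometric primitives on a plain canvas.

    Illustrative sketch only: this is NOT the authors' Primitives-PS
    pipeline, just a minimal example of copyright/privacy-free
    synthetic image generation from generic shape statistics.
    """
    rng = random.Random(seed)
    background = tuple(rng.randrange(256) for _ in range(3))
    img = Image.new("RGB", (size, size), background)
    draw = ImageDraw.Draw(img)
    for _ in range(n_shapes):
        color = tuple(rng.randrange(256) for _ in range(3))
        x0, y0 = rng.randrange(size), rng.randrange(size)
        w, h = rng.randrange(8, size // 2), rng.randrange(8, size // 2)
        box = [x0, y0, x0 + w, y0 + h]
        shape = rng.choice(["ellipse", "rectangle", "line"])
        if shape == "ellipse":
            draw.ellipse(box, fill=color)
        elif shape == "rectangle":
            draw.rectangle(box, fill=color)
        else:
            draw.line(box, fill=color, width=rng.randrange(1, 8))
    return img

if __name__ == "__main__":
    # Generate a few synthetic pretraining images, one per seed.
    for i in range(4):
        synth_primitive_image(seed=i).save(f"primitive_{i:03d}.png")
```

Because every image is sampled from random shape, position, size, and color distributions rather than from real photographs, a dataset produced this way contains no identifiable people or copyrighted content, which is the property the abstract emphasizes.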
