Nov 28, 2022
We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both f-divergences and integral probability metrics (IPMs), such as the 1-Wasserstein distance. We prove under which assumptions these divergences, hereafter referred to as (f,Γ)-divergences, provide a notion of 'distance' between probability measures and show that they can be expressed as a two-stage mass-redistribution/mass-transport process. The (f,Γ)-divergences inherit features from IPMs, such as the ability to compare distributions which are not absolutely continuous, as well as from f-divergences, namely the strict concavity of their variational representations and the ability to control heavy-tailed distributions for particular choices of f. When combined, these features establish a divergence with improved properties for estimation, statistical learning, and uncertainty quantification applications. Using statistical learning as an example, we demonstrate their advantage in training generative adversarial networks (GANs) for heavy-tailed, not-absolutely-continuous sample distributions. We also show improved performance and stability over gradient-penalized Wasserstein GAN in image generation.
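For reference, a minimal sketch of the variational objective that this family of divergences typically takes (notation assumed here, not taken from this page): Γ denotes a set of test functions, e.g. 1-Lipschitz functions in the Wasserstein/IPM case, and f^* is the Legendre transform of f,

  D_f^{\Gamma}(P \,\|\, Q) \;=\; \sup_{g \in \Gamma} \Big\{ \mathbb{E}_P[g] \;-\; \inf_{\nu \in \mathbb{R}} \big\{ \nu + \mathbb{E}_Q\big[f^{*}(g - \nu)\big] \big\} \Big\}.

Informally, enlarging Γ to all bounded measurable functions recovers the classical f-divergence, while choosing f so that f^*(y) = y reduces the objective to the IPM sup_{g ∈ Γ} { E_P[g] − E_Q[g] }; the precise conditions under which these limits hold are those stated in the paper, not reproduced here.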