Sinkhorn Divergences: Bridging the gap between Optimal Transport and MMD

Dec 13, 2019

About

Sinkhorn Divergences, based on entropy-regularized optimal transport (OT), were first introduced by Cuturi in 2013 as a way to alleviate the computational burden of OT. However, this family of losses actually interpolates between OT (no regularization) and MMD (infinite regularization). The interpolation property also holds in terms of sample complexity, and thus regularizing OT breaks its curse of dimensionality. We will illustrate these theoretical claims on a set of learning problems, such as learning a distribution from samples.
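To make the interpolation claim concrete, here is one standard way the Sinkhorn Divergence is written in the entropic-OT literature (the notation α, β, c, ε below is ours, not taken from the talk): for probability measures α and β, a ground cost c, and a regularization strength ε > 0,

\[
\mathrm{OT}_\varepsilon(\alpha,\beta) \;=\; \min_{\pi \in \Pi(\alpha,\beta)} \int c(x,y)\,\mathrm{d}\pi(x,y) \;+\; \varepsilon\,\mathrm{KL}\!\left(\pi \,\middle\|\, \alpha \otimes \beta\right),
\qquad
S_\varepsilon(\alpha,\beta) \;=\; \mathrm{OT}_\varepsilon(\alpha,\beta) \;-\; \tfrac{1}{2}\,\mathrm{OT}_\varepsilon(\alpha,\alpha) \;-\; \tfrac{1}{2}\,\mathrm{OT}_\varepsilon(\beta,\beta).
\]

As \( \varepsilon \to 0 \), \( S_\varepsilon \) recovers the unregularized OT cost, while as \( \varepsilon \to \infty \) it converges (up to a constant factor, and under suitable conditions on c) to the squared MMD with kernel \( -c \); this is the interpolation in ε that the abstract refers to.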

About NIPS 2019

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Following the conference, there are workshops, which provide a less formal setting.
