Dec 13, 2019
Sinkhorn Divergences build on entropy-regularized OT, introduced by Cuturi in 2013 as a solution to the computational burden of OT. This family of losses actually interpolates between OT (no regularization) and MMD (infinite regularization). The interpolation property also holds in terms of sample complexity, and thus regularizing OT breaks its curse of dimensionality. We will illustrate these theoretical claims on a set of learning problems, such as learning a distribution from samples.
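As a rough illustration of the loss discussed in the talk, here is a minimal NumPy sketch of a Sinkhorn divergence between two point clouds. It runs log-domain Sinkhorn iterations on uniform empirical measures and uses the debiasing S(a,b) = OT(a,b) - (OT(a,a) + OT(b,b))/2; the cost reported is the transport cost under the entropic plan (one common convention), and the function names, the squared-distance cost, and the iteration count are illustrative choices, not from the talk.

```python
import numpy as np

def _logsumexp(M, axis):
    # Numerically stable log-sum-exp along the given axis.
    mx = M.max(axis=axis, keepdims=True)
    return np.squeeze(mx, axis=axis) + np.log(np.exp(M - mx).sum(axis=axis))

def sinkhorn_cost(x, y, eps, n_iter=200):
    # Entropic OT cost between uniform empirical measures on x and y,
    # with regularization strength eps, via log-domain Sinkhorn iterations.
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-distance cost
    n, m = C.shape
    log_a = np.full(n, -np.log(n))  # uniform weights 1/n (log scale)
    log_b = np.full(m, -np.log(m))
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(n_iter):
        # Dual potential updates; eps -> 0 recovers OT, large eps flattens the plan.
        f = -eps * _logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * _logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Entropic transport plan and its transport cost <P, C>.
    P = np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])
    return np.sum(P * C)

def sinkhorn_divergence(x, y, eps):
    # Debiased loss: zero when the two samples coincide.
    return sinkhorn_cost(x, y, eps) - 0.5 * (
        sinkhorn_cost(x, x, eps) + sinkhorn_cost(y, y, eps)
    )
```

By construction the divergence vanishes when both samples are identical, while two well-separated clouds give a large positive value; production use would rely on a dedicated library rather than this sketch.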
Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.