Title: Sinkhorn divergences: Bridging the gap between optimal transport and MMD
Authors: Aude Genevay - MIT (United States) [presenting]
Abstract: Sinkhorn divergences, based on entropy-regularized optimal transport (OT), were first introduced as a way to ease the computational burden of OT. However, this family of losses actually interpolates between OT (no regularization) and MMD (infinite regularization). The interpolation also holds in terms of sample complexity, so regularizing OT breaks its curse of dimensionality. We will illustrate these theoretical claims on learning problems such as learning a distribution from samples.
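A minimal NumPy sketch of the quantity behind the talk, under assumptions not stated in the abstract: uniform weights on two point clouds, a squared Euclidean cost, plain (non-log-domain) Sinkhorn iterations with a fixed iteration count, and the "sharp" transport cost ⟨C, P⟩ without the entropy term. The debiased Sinkhorn divergence is S_eps(x, y) = OT_eps(x, y) − ½·OT_eps(x, x) − ½·OT_eps(y, y); the regularization strength `eps` is the interpolation knob (small eps ≈ OT, large eps ≈ an MMD-like loss).

```python
import numpy as np

def sinkhorn_cost(x, y, eps, n_iter=200):
    """Entropy-regularized OT cost between uniform point clouds (illustrative sketch)."""
    # Squared Euclidean cost matrix between samples
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    n, m = C.shape
    a = np.full(n, 1.0 / n)  # uniform weights on x
    b = np.full(m, 1.0 / m)  # uniform weights on y
    K = np.exp(-C / eps)     # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):  # Sinkhorn fixed-point iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]  # approximate optimal transport plan
    return (P * C).sum()             # transport cost <C, P> (entropy term omitted here)

def sinkhorn_divergence(x, y, eps, n_iter=200):
    """Debiased Sinkhorn divergence: zero when x == y, positive otherwise."""
    return (sinkhorn_cost(x, y, eps, n_iter)
            - 0.5 * sinkhorn_cost(x, x, eps, n_iter)
            - 0.5 * sinkhorn_cost(y, y, eps, n_iter))
```

The two self-terms remove the entropic bias that would otherwise make the regularized cost nonzero even for identical distributions; this debiasing is what makes the loss suitable for learning a distribution from samples. For very small `eps` the kernel `K` underflows, so practical implementations work in the log domain.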