A1212
Title: Wasserstein-Cramér-Rao inequality and robustness
Authors: Takeru Matsuda - University of Tokyo & RIKEN Center for Brain Science (Japan) [presenting]
Abstract: Whereas the Kullback-Leibler divergence plays a central role in statistical inference and information geometry, the Wasserstein distance induces another geometric structure on statistical models through optimal transport. Recently, a Wasserstein counterpart of the Cramér-Rao inequality has been developed, in which the Wasserstein information matrix (Otto metric) appears in place of the Fisher information matrix. A statistical implication of the Wasserstein-Cramér-Rao inequality is given from the viewpoint of estimator robustness. A condition for an estimator to attain the Wasserstein-Cramér-Rao lower bound is also derived; it is related to a Wasserstein counterpart of the one-parameter exponential family.
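A minimal sketch of the inequality in the scalar, unbiased case, with notation assumed here rather than drawn from the talk: writing p_\theta for the model, T(X) for an unbiased estimator of \theta, and G_W(\theta) for the Wasserstein information (Otto metric), one common form of the bound reads

% Wasserstein-Cramér-Rao bound (scalar parameter, unbiased T); assumed form
\mathbb{E}_{p_\theta}\!\left[\, \|\nabla_x T(x)\|^2 \,\right] \;\geq\; \frac{1}{G_W(\theta)},
\qquad
G_W(\theta) \;=\; \int \|\nabla_x \Phi_\theta(x)\|^2 \, p_\theta(x)\, \mathrm{d}x,

where \Phi_\theta solves the continuity-type equation -\nabla_x \cdot \left( p_\theta \nabla_x \Phi_\theta \right) = \partial_\theta p_\theta. Because \nabla_x T quantifies how much the estimate moves when an observation is perturbed, the left-hand side can be read as a sensitivity (lack-of-robustness) measure, which is the sense in which the robustness implication above arises; the matrix version replaces 1/G_W(\theta) with G_W(\theta)^{-1} in the Loewner order.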