A0982
Title: Variance-aware estimation of kernel mean embedding
Authors: Geoffrey Wolfer - Waseda University (Japan) [presenting]
Pierre Alquier - ESSEC Business School (Singapore)
Abstract: An important feature of kernel mean embeddings (KMEs) is that the rate of convergence of the empirical KME to the KME of the true distribution can be bounded independently of the dimension of the space, properties of the distribution, and smoothness features of the kernel. It is shown how to speed up convergence by leveraging variance information in the reproducing kernel Hilbert space. Furthermore, it is shown that even when such information is a priori unknown, it can be efficiently estimated from the data, recovering the desideratum of a distribution-agnostic bound that enjoys acceleration in favourable settings. The results are further extended from independent data to stationary mixing sequences, and the methods are illustrated in the context of hypothesis testing and robust parametric estimation.
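
For intuition only, the sketch below (not the authors' construction) illustrates the variance quantity involved: while distribution-free bounds on the empirical KME error scale with the supremum of the kernel, a variance-aware bound instead scales with the RKHS variance sigma^2 = E[k(X,X)] - E[k(X,X')] of the feature map around its mean, which admits a simple plug-in estimate from the sample. The Gaussian kernel choice and the function names (gaussian_kernel, rkhs_variance) are illustrative assumptions, not part of the abstract.

import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def rkhs_variance(X, kernel=gaussian_kernel):
    # Plug-in estimate of sigma^2 = E[k(X,X)] - E[k(X,X')], the variance of the
    # feature map k(X, .) around the mean embedding in the RKHS.
    K = kernel(X, X)
    n = X.shape[0]
    diag_mean = np.trace(K) / n                               # estimates E[k(X,X)]
    off_diag_mean = (K.sum() - np.trace(K)) / (n * (n - 1))   # estimates E[k(X,X')]
    return max(diag_mean - off_diag_mean, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    sigma2 = rkhs_variance(X)
    # A variance-aware (Bernstein-type) deviation bound on ||mu_hat - mu||_H would
    # scale with sqrt(sigma2 / n) plus a lower-order term, rather than with
    # sqrt(sup_x k(x,x) / n) as in the distribution-free rate.
    print(f"plug-in RKHS variance estimate: {sigma2:.4f}")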