CMStatistics 2023
B0298
Title: Kernel cumulants
Authors: Patric Bonnier - University of Oxford (United Kingdom)
Harald Oberhauser - University of Oxford (United Kingdom)
Zoltan Szabo - LSE (United Kingdom) [presenting]
Abstract: Maximum mean discrepancy (MMD, also called energy distance) and the Hilbert-Schmidt independence criterion (HSIC, a.k.a. distance covariance) rely on the mean embedding of probability distributions and are among the most successful approaches in machine learning and statistics to quantify the discrepancy between distributions and the dependence of random variables, respectively. Higher-order variants of MMD and HSIC are presented by extending the notion of cumulants to reproducing kernel Hilbert spaces. The resulting kernelized cumulants have various benefits: (i) they characterize the equality of distributions and independence under very mild conditions, (ii) they are easy to estimate with minimal computational overhead compared to their degree-one counterparts (MMD and HSIC), and (iii) they achieve improved power in two-sample and independence testing on environmental and traffic data.
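The degree-one quantities the abstract builds on (MMD and HSIC) admit simple sample estimators. The sketch below is illustrative only: it uses a Gaussian kernel with an arbitrary bandwidth `sigma`, standard textbook estimators (the unbiased U-statistic for squared MMD and the biased V-statistic for HSIC), and hypothetical function names; it does not reproduce the higher-order kernelized cumulants of the talk.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # Unbiased U-statistic estimator of squared MMD between samples X ~ P, Y ~ Q:
    # diagonal terms of the within-sample Gram matrices are excluded.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

def hsic_biased(X, Y, sigma=1.0):
    # Biased V-statistic estimator of HSIC: trace(Kx H Ky H) / n^2,
    # where H is the centering matrix; zero iff empirical dependence vanishes.
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n
    Kx = gaussian_kernel(X, X, sigma)
    Ky = gaussian_kernel(Y, Y, sigma)
    return np.trace(Kx @ H @ Ky @ H) / n**2
```

On samples from two well-separated Gaussians, `mmd2_unbiased` is much larger than on two samples from the same distribution, and `hsic_biased` is larger for dependent pairs than for independent ones; the kernelized cumulants of the talk are higher-order refinements of these two statistics.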