COMPSTAT 2024
A0494
Title: $L^2$-divergence estimator with better finite sample performance
Authors: Sixiao Zhu - Paris 1 University (France) [presenting]
Alain Celisse - Paris 1 University (France)
Abstract: The problem of estimating the $L^2$ divergence between two continuous probability distributions is studied. A kernel-based estimator is considered in which two kernels are employed, one per distribution, allowing a finer bias-variance tradeoff for each. In the asymptotic regime, the convergence rate of the whole estimator is dominated by the rougher of the two distributions; in a finite-sample regime, by contrast, the two-kernel framework admits a better oracle estimator than the one-kernel framework. This better oracle is also tractable via a simple model selection procedure, as shown by an oracle inequality established for the procedure.
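The flavor of the two-kernel framework can be illustrated with a minimal sketch, not the authors' estimator: the $L^2$ divergence $\int (p - q)^2$ expands into $\int p^2 + \int q^2 - 2\int pq$, and each integral is estimated by averaging kernel evaluations over sample pairs, with a separate bandwidth for each distribution. The Gaussian kernel, the bandwidth values, and the averaged bandwidth used for the cross term are all illustrative assumptions.

```python
import numpy as np

def gauss_kernel(u, h):
    # Gaussian smoothing kernel with bandwidth h
    return np.exp(-0.5 * (u / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

def l2_divergence(x, y, h_x, h_y):
    """U-statistic-style estimate of int (p - q)^2 from samples x ~ p, y ~ q.

    Each of the three integrals in the expansion is estimated by averaging
    kernel evaluations over pairs; h_x smooths the p-sample term and h_y the
    q-sample term, so the two distributions get separate bias-variance
    tradeoffs. The cross term here uses the average bandwidth, an arbitrary
    illustrative choice.
    """
    n, m = len(x), len(y)
    kx = gauss_kernel(x[:, None] - x[None, :], h_x)
    np.fill_diagonal(kx, 0.0)          # drop i == j pairs to avoid bias
    ky = gauss_kernel(y[:, None] - y[None, :], h_y)
    np.fill_diagonal(ky, 0.0)
    term_pp = kx.sum() / (n * (n - 1))                         # estimates int p^2
    term_qq = ky.sum() / (m * (m - 1))                         # estimates int q^2
    h_xy = 0.5 * (h_x + h_y)
    term_pq = gauss_kernel(x[:, None] - y[None, :], h_xy).mean()  # estimates int p q
    return term_pp + term_qq - 2.0 * term_pq

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 500)
y_same = rng.normal(0.0, 1.0, 500)   # same distribution: divergence near 0
y_far = rng.normal(5.0, 1.0, 500)    # well-separated: divergence clearly positive
est_same = l2_divergence(x, y_same, 0.5, 0.5)
est_far = l2_divergence(x, y_far, 0.5, 0.5)
```

The separate bandwidths `h_x` and `h_y` are the point of the sketch: if $p$ is smooth and $q$ is rough, each can be tuned to its own distribution, which is the finite-sample flexibility the abstract exploits.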