A0915
Title: Kernel debiased plug-in estimation
Authors: Ivana Malenica - Harvard University (United States) [presenting]
Abstract: Modern estimation methods rely on the plug-in principle, which substitutes unknown parameters of the underlying data-generating process with their estimated empirical counterparts. Flexible machine learning (ML) estimation methods have further exploited the plug-in approach. The use of highly adaptive, complex ML algorithms, however, induces plug-in bias (first-order bias) that propagates to the downstream estimate. Traditional methods addressing this sub-optimal bias-variance trade-off rely on the efficient influence function (EIF) of the target parameter. When estimating multiple target parameters, these methods require debiasing the nuisance parameter multiple times using the corresponding EIFs, which poses analytical and computational challenges. The targeted maximum likelihood estimation framework is leveraged to propose a novel method named kernel debiased plug-in estimation (KDPE). KDPE refines an initial estimate through regularized likelihood maximization steps, employing a nonparametric model based on reproducing kernel Hilbert spaces. It is shown that KDPE (i) simultaneously debiases all pathwise differentiable target parameters that satisfy the stated regularity conditions, (ii) does not require the EIF for implementation, and (iii) remains computationally tractable. The use of KDPE is illustrated numerically, and the theoretical results are validated.
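
The description above suggests the following shape for a single KDPE-style update: tilt an initial density estimate along kernel fluctuation directions and refit by maximizing a penalized log-likelihood over the tilt coefficients, after which any smooth target parameter is read off the same updated distribution. The sketch below is a minimal, hypothetical illustration of that idea for a one-dimensional density; the kernel, centres, bandwidth, penalty, and all function names are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of one KDPE-style update for a 1-D density estimate.
# All modelling choices (RBF kernel, fixed centres, bandwidth, penalty lam)
# are illustrative assumptions, not the method as published.
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=1.0, size=200)      # observed sample
grid = np.linspace(-5.0, 6.0, 1001)               # integration grid
centres = np.linspace(-2.0, 3.0, 15)              # kernel basis centres

def rbf_kernel(a, b, bandwidth=0.75):
    """Gaussian kernel matrix between points in `a` (rows) and `b` (columns)."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / bandwidth) ** 2)

def p0(z):
    """Initial (deliberately misspecified) plug-in density: standard normal."""
    return np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)

def kdpe_step(x, lam=1e-2):
    """One regularized likelihood-maximization step over a kernel fluctuation.

    The updated density is p_eps(z) proportional to p0(z) * exp(sum_j eps_j k(c_j, z)),
    i.e. the initial estimate tilted along kernel functions at the centres c_j;
    eps maximizes the log-likelihood penalized by the quadratic form eps' K eps.
    """
    K_x = rbf_kernel(x, centres)          # kernel features at the observations
    K_g = rbf_kernel(grid, centres)       # kernel features on the grid
    K_c = rbf_kernel(centres, centres)    # Gram matrix for the penalty

    def objective(eps):
        log_norm = np.log(trapezoid(p0(grid) * np.exp(K_g @ eps), grid))
        loglik = np.mean(np.log(p0(x)) + K_x @ eps - log_norm)
        return -loglik + lam * eps @ K_c @ eps

    eps = minimize(objective, np.zeros(len(centres)), method="L-BFGS-B").x
    p1 = p0(grid) * np.exp(K_g @ eps)
    return p1 / trapezoid(p1, grid)       # normalized updated density

p1 = kdpe_step(x)
# Multiple smooth target parameters are read off the same updated density, e.g.:
mean_hat = trapezoid(grid * p1, grid)
var_hat = trapezoid((grid - mean_hat) ** 2 * p1, grid)
print(f"updated plug-in mean: {mean_hat:.3f}, variance: {var_hat:.3f}")
```

In this toy version the fluctuation is restricted to a small fixed set of kernel centres purely to keep the optimization cheap; the point it illustrates is that a single penalized likelihood update of the distribution serves all downstream target parameters at once, without computing an EIF for each.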