CMStatistics 2022
B0643
Title: Optimal nonparametric inference with two-scale distributional nearest neighbors
Authors: Emre Demirkaya - University of Tennessee, Knoxville (United States)
Yingying Fan - University of Southern California (United States)
Lan Gao - University of Tennessee, Knoxville (United States) [presenting]
Jinchi Lv - University of Southern California (United States)
Patrick Vossler - Stanford University (United States)
Jingbo Wang - University of Southern California (United States)
Abstract: The weighted nearest neighbors (WNN) estimator has been widely used as a flexible and easy-to-implement nonparametric tool for mean regression estimation. The bagging technique is an elegant way to form WNN estimators with weights automatically assigned to the nearest neighbors; we call the resulting estimator the distributional nearest neighbors (DNN) estimator for easy reference. Yet, there is a lack of distributional results for such an estimator, limiting its application to statistical inference. Moreover, when the mean regression function has higher-order smoothness, DNN does not achieve the optimal nonparametric convergence rate, mainly because of its bias. We provide an in-depth technical analysis of DNN, based on which we suggest a bias-reduction approach: linearly combining two DNN estimators with different subsampling scales, resulting in the novel two-scale DNN (TDNN) estimator. We prove that the TDNN estimator enjoys the optimal nonparametric rate of convergence in estimating the regression function under a fourth-order smoothness condition. We further go beyond estimation and establish asymptotic normality for both the DNN and TDNN estimators. For practical implementation, we also provide variance estimators and a distribution estimator for the TDNN using the jackknife and bootstrap techniques.
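For intuition, a minimal Python sketch of the construction described above follows. It rests on two facts hedged as assumptions here: the bagged 1-NN estimator admits closed-form weights, with the i-th nearest neighbor of the query point receiving weight C(n-i, s-1)/C(n, s) (the probability of being nearest within a uniformly drawn size-s subsample), and the leading DNN bias term scales like s^(-2/d). The function names and the naive bootstrap variance at the end are illustrative, not the paper's exact implementation.

    import numpy as np

    def dnn_weights(n, s):
        # Closed-form DNN weights: the (i+1)-th nearest neighbor (0-based
        # index i) gets weight C(n-i-1, s-1) / C(n, s), computed via a
        # numerically stable ratio recursion instead of huge binomials.
        w = np.zeros(n)
        w[0] = s / n  # first neighbor: C(n-1, s-1) / C(n, s) = s/n
        for i in range(1, n - s + 1):
            w[i] = w[i - 1] * (n - i - s + 1) / (n - i)
        return w

    def dnn_estimate(X, y, x0, s):
        # DNN estimate of E[Y | X = x0]: responses sorted by distance of
        # their covariates to x0, then averaged with the DNN weights.
        order = np.argsort(np.linalg.norm(X - x0, axis=1))
        return dnn_weights(len(y), s) @ y[order]

    def tdnn_estimate(X, y, x0, s1, s2):
        # Two-scale DNN: combine DNN(s1) and DNN(s2), s1 < s2, with weights
        # solving w1 + w2 = 1 and w1*s1^(-2/d) + w2*s2^(-2/d) = 0, so the
        # assumed leading bias term cancels (Richardson-type extrapolation).
        d = X.shape[1]
        r = (s2 / s1) ** (2.0 / d)
        w1 = 1.0 / (1.0 - r)  # negative weight on the more biased scale s1
        w2 = 1.0 - w1         # weight > 1 on the less biased scale s2
        return w1 * dnn_estimate(X, y, x0, s1) + w2 * dnn_estimate(X, y, x0, s2)

    def tdnn_bootstrap_var(X, y, x0, s1, s2, B=200, seed=0):
        # Naive nonparametric bootstrap variance of the TDNN estimate at x0;
        # a simplified stand-in for the paper's variance estimators.
        rng = np.random.default_rng(seed)
        n = len(y)
        reps = []
        for _ in range(B):
            idx = rng.integers(0, n, n)
            reps.append(tdnn_estimate(X[idx], y[idx], x0, s1, s2))
        return np.var(reps, ddof=1)

As a quick check of the extrapolation, with s1 = 4, s2 = 16, and d = 2 the weights are w1 = -1/3 and w2 = 4/3, and w1/4 + w2/16 = 0, so the assumed s^(-2/d) bias term cancels exactly while the weights still sum to one.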