A1076
Title: Scalable statistical inference in non-parametric least squares
Authors: Meimei Liu - Virginia Tech (United States) [presenting]
Yun Yang - University of Illinois at Urbana-Champaign (United States)
Zuofeng Shang - New Jersey Institute of Technology (United States)
Abstract: Stochastic approximation (SA), such as stochastic gradient descent (SGD), is a powerful and scalable algorithm for solving stochastic optimization problems in large-scale and streaming data settings. An inferential framework is developed for SA in nonparametric least squares problems within a reproducing kernel Hilbert space (RKHS). An online multiplier bootstrap method is proposed for local inference through pointwise confidence intervals and for global inference through simultaneous confidence bands, thus advancing SA-based estimation in nonparametric regression models. The main contributions are a unified framework for deriving the non-asymptotic behaviour of the infinite-dimensional SGD estimate under the supremum norm, and a demonstration of the consistency of the multiplier bootstrap method in nonparametric settings. Finally, the proposed method is applied to neuroimaging data for statistical inference.
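The abstract contains no code, but a minimal sketch may help convey the mechanics described above. The snippet below is not the authors' implementation: it runs one pass of functional SGD for kernel least squares, keeping the iterate as a kernel expansion over the streamed points (each update appends one coefficient), and runs B multiplier-bootstrap SGD paths on the same stream, each rescaling the stochastic gradient by an i.i.d. mean-one, variance-one weight. The kernel choice, bandwidth, step-size schedule, multiplier distribution, and the helper names rbf and sgd_with_multiplier_bootstrap are all illustrative assumptions; Polyak averaging and regularisation, which the actual framework may employ, are omitted.

```python
import numpy as np

def rbf(x, z, h=0.3):
    """Gaussian RBF kernel; the bandwidth h is an illustrative choice."""
    return np.exp(-(np.asarray(x) - z) ** 2 / (2 * h ** 2))

def sgd_with_multiplier_bootstrap(xs, ys, B=200, gamma0=1.0, seed=0):
    """One pass of functional SGD for nonparametric least squares in an
    RKHS, run jointly with B multiplier-bootstrap SGD paths that see the
    same data stream but rescale each stochastic gradient by an i.i.d.
    mean-one, variance-one multiplier. The estimate is the expansion
    f(x) = sum_j alpha[j] * K(xs[j], x)."""
    rng = np.random.default_rng(seed)
    n = len(xs)
    alpha = np.zeros(n)             # coefficients of the SGD iterate
    alpha_b = np.zeros((B, n))      # coefficients of the bootstrap paths
    for t in range(n):
        gamma = gamma0 / np.sqrt(t + 1)        # decaying step size (placeholder)
        k = rbf(xs[:t], xs[t])                 # kernel against past support points
        resid = alpha[:t] @ k - ys[t]          # f_t(x_t) - y_t
        alpha[t] = -gamma * resid              # new expansion coefficient
        w = 1.0 + rng.standard_normal(B)       # multipliers, mean 1, variance 1
        resid_b = alpha_b[:, :t] @ k - ys[t]   # residuals of bootstrap iterates
        alpha_b[:, t] = -gamma * w * resid_b   # perturbed gradient steps
    return alpha, alpha_b

# Toy usage: pointwise intervals and a simultaneous band on a grid.
rng = np.random.default_rng(1)
xs = rng.uniform(0, 1, 2000)
ys = np.sin(2 * np.pi * xs) + 0.3 * rng.standard_normal(2000)
alpha, alpha_b = sgd_with_multiplier_bootstrap(xs, ys)

grid = np.linspace(0, 1, 50)
K_grid = rbf(xs[:, None], grid)     # (n, grid) kernel matrix
f_hat = alpha @ K_grid              # SGD estimate on the grid
f_boot = alpha_b @ K_grid           # (B, grid) bootstrap curves

# Pointwise 95% confidence intervals from centred bootstrap quantiles.
dev = f_boot - f_hat
lo, hi = np.quantile(dev, [0.025, 0.975], axis=0)
ci = (f_hat - hi, f_hat - lo)

# Simultaneous 95% confidence band from the bootstrap sup-norm quantile.
r = np.quantile(np.abs(dev).max(axis=1), 0.95)
band = (f_hat - r, f_hat + r)
```

A practical implementation would avoid the quadratic cost of growing kernel expansions (e.g., via random features or truncation); the sketch only mirrors the structure of updating the estimate and its bootstrap perturbations on the same stream, which is what makes the bootstrap "online".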