CFE-CMStatistics 2024
A0643
Title: Online inference for stochastic gradient descent with dropout regularization
Authors: Jiaqi Li - University of Chicago (United States) [presenting]
Johannes Schmidt-Hieber - University of Twente (Netherlands)
Wei Biao Wu - University of Chicago (United States)
Abstract: An online inference method is proposed for stochastic gradient descent (SGD) iterates with dropout regularization in linear regression. Specifically, a geometric-moment contraction (GMC) is established for the constant step-size SGD dropout iterates, showing the existence of a unique stationary solution of the dropout recursion. By the GMC property, quenched central limit theorems (CLTs) are provided for the difference between the dropout and $\ell^2$-regularized iterates, valid for arbitrary fixed initial points. A CLT is also presented for the difference between the Ruppert-Polyak averaged SGD (ASGD) dropout iterates and the $\ell^2$-regularized iterates. Based on these asymptotic normality results, an online estimator of the long-run covariance matrix of the ASGD dropout iterates is further introduced, enabling recursive inference that is efficient in both computation time and memory. Numerical experiments demonstrate that the proposed confidence intervals for the ASGD dropout iterates achieve the nominal asymptotic coverage probability.
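To make the recursion concrete, a minimal Python sketch follows (not the authors' implementation): it runs constant step-size SGD with dropout on a simulated linear regression, maintains the Ruppert-Polyak average online, and forms confidence intervals from a simple fixed-block batch-means covariance estimate in place of the paper's fully online, growing-block estimator. All constants, the isotropic design, and the block length are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y = x' beta + noise; sizes and constants are
# illustrative assumptions, not values from the submission.
d, n = 5, 200_000
beta_true = np.linspace(1.0, 2.0, d)
p = 0.8        # dropout retain probability (assumed)
alpha = 0.01   # constant SGD step size (assumed)
B = 1_000      # fixed block length for the batch-means covariance estimate

beta = np.zeros(d)        # SGD dropout iterate
beta_bar = np.zeros(d)    # Ruppert-Polyak (ASGD) average
block_sum = np.zeros(d)   # running sum of iterates in the current block
block_means = []          # means of completed blocks

for k in range(1, n + 1):
    x = rng.normal(size=d)                  # isotropic design: the dropout
    y = x @ beta_true + rng.normal()        # fixed point then equals beta_true
    mask = rng.binomial(1, p, size=d)       # diagonal of the dropout matrix D_k
    resid = y - (mask * x) @ beta           # residual at the dropped-out iterate
    beta = beta + alpha * mask * x * resid  # dropout SGD recursion
    beta_bar += (beta - beta_bar) / k       # online ASGD averaging
    block_sum += beta
    if k % B == 0:
        block_means.append(block_sum / B)
        block_sum = np.zeros(d)

# Batch-means estimate of the long-run covariance of the iterates; the
# online estimator in the paper uses growing blocks instead.
Sigma_hat = B * np.cov(np.array(block_means), rowvar=False)

# 95% confidence intervals for each coordinate of the averaged iterate.
half_width = 1.96 * np.sqrt(np.diag(Sigma_hat) / n)
for j in range(d):
    print(f"beta[{j}]: {beta_bar[j]:.3f} +/- {half_width[j]:.3f}")

With an isotropic Gaussian design, the stationary mean of the dropout recursion coincides with beta_true, so the resulting intervals can be checked directly against the simulated truth.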