B0460
Title: On scalable inference with stochastic gradient descent
Authors: Yixin Fang - New Jersey Institute of Technology (United States) [presenting]
Jinfeng Xu - University of Hong Kong (Hong Kong)
Lei Yang - New York University School of Medicine (United States)
Abstract: In many applications involving large data sets, stochastic gradient descent (SGD) provides a scalable way to compute parameter estimates and has gained increasing popularity due to its numerical convenience and memory efficiency. While the asymptotic properties of SGD-based estimators were established decades ago, statistical inference, such as interval estimation, remains largely unexplored. Traditional resampling methods such as the bootstrap are not computationally feasible since they require repeatedly drawing independent samples from the entire dataset. The plug-in method is not applicable when there is no explicit formula for the covariance matrix of the estimator, or when the application comes from an online setting where observations arrive sequentially and it may not be necessary or realistic to store the entire dataset. We propose a scalable inferential procedure for stochastic gradient descent which, upon the arrival of each observation, updates the SGD estimate as well as a large number of randomly perturbed SGD estimates. The proposed method is easy to implement in practice. We establish its theoretical properties in a general model setting that includes generalized linear models and quantile regression as special cases.
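
The updating scheme described in the abstract can be sketched in code. Below is a minimal, hypothetical Python illustration of the perturbation idea, assuming a toy linear model with squared-error loss, a Robbins-Monro step size, i.i.d. mean-one exponential perturbation weights, and percentile-type intervals; these specific choices (function names, step-size constants, weight distribution, interval construction) are illustrative assumptions, not details taken from the abstract.

    import numpy as np

    # Sketch of the perturbation-based online inference idea: alongside
    # the plain SGD iterate, maintain B randomly perturbed SGD iterates
    # whose per-observation gradients are rescaled by independent
    # mean-one weights (here Exp(1)); the spread of the perturbed
    # estimates is then used to form interval estimates. The model,
    # loss, and tuning constants are toy choices for illustration.

    rng = np.random.default_rng(0)

    def grad_linear(theta, x, y):
        """Gradient of the squared-error loss for one observation (toy model)."""
        return (x @ theta - y) * x

    def online_sgd_inference(stream, dim, B=200, gamma0=0.5, alpha=0.501):
        theta = np.zeros(dim)             # plain SGD estimate
        thetas_pert = np.zeros((B, dim))  # B randomly perturbed SGD estimates
        for t, (x, y) in enumerate(stream, start=1):
            lr = gamma0 * t ** (-alpha)   # Robbins-Monro step size
            theta -= lr * grad_linear(theta, x, y)
            # one pass over the perturbed copies: same observation,
            # gradient rescaled by an independent mean-one weight
            w = rng.exponential(1.0, size=B)
            for b in range(B):
                thetas_pert[b] -= lr * w[b] * grad_linear(thetas_pert[b], x, y)
        # 95% intervals from empirical quantiles of the perturbed estimates
        # (a percentile-type construction; a normal interval based on the
        # sample covariance of the perturbed estimates is another option)
        lo = np.quantile(thetas_pert, 0.025, axis=0)
        hi = np.quantile(thetas_pert, 0.975, axis=0)
        return theta, np.column_stack([lo, hi])

    # Toy usage: observations from y = x' theta* + noise arrive one at a time.
    dim, n = 3, 5000
    theta_star = np.array([1.0, -2.0, 0.5])

    def make_stream(n):
        for _ in range(n):
            x = rng.normal(size=dim)
            yield x, x @ theta_star + rng.normal()

    theta_hat, cis = online_sgd_inference(make_stream(n), dim)
    print("SGD estimate:", theta_hat)
    print("95% intervals:", cis)

Note that each arriving observation is used once and then discarded, so the memory cost is O(B * dim) regardless of the stream length, which is what makes such a scheme feasible in the online setting the abstract describes.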