EcoSta 2023
A0324
Title: Statistical inference with stochastic gradient methods under $\phi$-mixing data
Authors: Ruiqi Liu - Texas Tech University (United States) [presenting]
Xi Chen - New York University (United States)
Zuofeng Shang - New Jersey Institute of Technology (United States)
Abstract: Stochastic gradient descent (SGD) is a scalable, memory-efficient optimization algorithm well suited to large datasets and streaming data, and it has attracted considerable attention. SGD-based estimators have also been applied successfully to statistical inference tasks such as interval estimation. However, most existing work assumes i.i.d. observations or Markov chains; how to conduct valid statistical inference when the observations come from a mixing time series remains unexplored. The general correlation among observations poses a challenge for interval estimation: methods that ignore this correlation can produce invalid confidence intervals. A mini-batch SGD estimator is proposed for statistical inference when the data are $\phi$-mixing, and confidence intervals are constructed via an associated mini-batch bootstrap SGD procedure. Using the "independent block" trick of Yu (1994), the proposed estimator is shown to be asymptotically normal, and the bootstrap procedure is shown to effectively approximate its limiting distribution. The proposed method is memory-efficient and easy to implement in practice. Simulation studies on synthetic data and an application to a real-world dataset support the theory.
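To make the procedure concrete, below is a minimal Python sketch of a mini-batch SGD estimator paired with a multiplier-style bootstrap for interval estimation, illustrated on linear regression. The model, the exponential multiplier weights, the step-size schedule, and the function name `minibatch_sgd_with_bootstrap` are illustrative assumptions, not the authors' implementation; the paper's exact mini-batch bootstrap construction and tuning may differ.

```python
import numpy as np

def minibatch_sgd_with_bootstrap(X, y, batch_size=32, n_boot=200,
                                 lr0=0.5, gamma=0.55, seed=0):
    """Sketch: mini-batch SGD point estimate plus multiplier-bootstrap
    confidence intervals for linear regression (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)                  # point-estimate path
    boot = np.zeros((n_boot, d))         # perturbed bootstrap paths
    theta_bar = np.zeros(d)              # Polyak-Ruppert average
    boot_bar = np.zeros((n_boot, d))
    n_steps = n // batch_size
    for t in range(n_steps):
        # contiguous mini-batches preserve the time-series order,
        # which matters under mixing dependence
        idx = slice(t * batch_size, (t + 1) * batch_size)
        Xb, yb = X[idx], y[idx]
        lr = lr0 * (t + 1) ** (-gamma)   # decaying step size
        grad = Xb.T @ (Xb @ theta - yb) / batch_size
        theta -= lr * grad
        # multiplier-bootstrap update: reweight each mini-batch gradient
        # by an i.i.d. positive random weight per bootstrap path (one
        # common construction; the paper's perturbation may differ)
        w = rng.exponential(1.0, size=n_boot)
        resid = Xb @ boot.T - yb[:, None]            # (batch, n_boot)
        boot -= lr * w[:, None] * (Xb.T @ resid).T / batch_size
        # online averaging of both the estimate and the bootstrap paths
        theta_bar += (theta - theta_bar) / (t + 1)
        boot_bar += (boot - boot_bar) / (t + 1)
    # 95% percentile intervals from the bootstrap averages
    lo, hi = np.percentile(boot_bar, [2.5, 97.5], axis=0)
    return theta_bar, lo, hi

# Hypothetical usage on synthetic data with AR(1) noise to mimic mixing:
n, d = 5000, 3
rng = np.random.default_rng(1)
X = rng.normal(size=(n, d))
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.5 * eps[i - 1] + rng.normal()
y = X @ np.array([1.0, -2.0, 0.5]) + eps
est, lo, hi = minibatch_sgd_with_bootstrap(X, y)
```

The sketch keeps only the current iterate, its running average, and the bootstrap paths in memory, which mirrors the memory efficiency the abstract claims; no per-observation history is stored.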