View Submission - EcoSta 2025
A0413
Title: Online Bernstein–von Mises theorem
Authors: Jeyong Lee - POSTECH (Korea, South) [presenting]
Junhyeok Choi - Pohang University of Science and Technology (Korea, South)
Minwoo Chae - Pohang University of Science and Technology (Korea, South)
Abstract: Online learning is an inferential paradigm in which parameters are learned sequentially from data rather than from a fixed dataset, as in batch learning. Instead of training on a large dataset at once, an online learning algorithm updates parameters incrementally as new data arrive; the assumption is that mini-batches from the entire dataset become available in sequential order. The Bayesian framework, which updates the belief about an unknown parameter after each observation, is naturally suited to online learning: as each mini-batch arrives, the posterior distribution is updated based on the current prior and the mini-batch observations, and the updated posterior serves as the prior for the next step. However, unless conjugacy holds, this naive Bayesian approach is rarely computationally tractable. If the model is regular, the updated posterior distribution can be approximated by a normal distribution, as justified by the Bernstein–von Mises theorem. A variational approximation is therefore considered at each step, and the frequentist properties of the sequentially updated posterior are investigated at the final step. Under mild assumptions, it is proven that the approximation error accumulated over the steps becomes negligible once the mini-batch size exceeds a certain threshold depending on the dimension of the parameter. Consequently, the difference between the sequentially updated posterior and the full posterior, computed from the entire dataset at once, is asymptotically negligible.
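To make the update scheme concrete, the following minimal Python sketch illustrates one way such a sequential normal approximation can be implemented for Bayesian logistic regression on a stream of mini-batches. It is illustrative only: a Laplace (Newton) step stands in for the variational approximation studied in the paper, and all names and settings (laplace_update, the N(0, I) prior, batch sizes) are assumptions of this sketch, not the authors' method.

    import numpy as np

    def laplace_update(m, S_inv, X, y, n_newton=10):
        # One online step: combine the normal "prior" N(m, S) carried over
        # from the previous step with a logistic-regression mini-batch (X, y)
        # and return a normal (Laplace) approximation of the new posterior.
        # S_inv is the prior precision matrix.
        theta = m.copy()
        for _ in range(n_newton):
            p = 1.0 / (1.0 + np.exp(-X @ theta))           # sigmoid probabilities
            grad = X.T @ (y - p) - S_inv @ (theta - m)     # log-posterior gradient
            W = p * (1.0 - p)                              # Bernoulli variances
            H = -(X.T * W) @ X - S_inv                     # log-posterior Hessian
            theta = theta - np.linalg.solve(H, grad)       # Newton step to the mode
        # New precision = negative Hessian at the mode
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        W = p * (1.0 - p)
        return theta, (X.T * W) @ X + S_inv

    # Simulate sequential mini-batches and run the online updates;
    # the normal approximation of each step serves as the next prior.
    rng = np.random.default_rng(0)
    d, batch_size, n_batches = 3, 200, 50
    theta_true = rng.normal(size=d)

    m, S_inv = np.zeros(d), np.eye(d)                      # N(0, I) initial prior
    for _ in range(n_batches):
        X = rng.normal(size=(batch_size, d))
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_true)))
        m, S_inv = laplace_update(m, S_inv, X, y)

    print("true parameter:        ", theta_true)
    print("online posterior mean: ", m)

Under the abstract's asymptotic regime, the final N(m, S_inv^{-1}) produced by such a scheme should be close to the full-data posterior once the mini-batch size is large enough relative to the parameter dimension.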