A1236
Title: Statistical and computational trade-offs in variational inference: A case study in inferential model selection
Authors: Yixin Wang - University of Michigan (United States) [presenting]
Abstract: Variational inference has recently emerged as a popular alternative to the classical Markov chain Monte Carlo (MCMC) in large-scale Bayesian inference. The core idea is to trade statistical accuracy for computational efficiency. The purpose is to study these statistical and computational trade-offs in variational inference via a case study in inferential model selection. Focusing on Gaussian inferential models (or variational approximating families) with diagonal plus low-rank precision matrices, a theoretical study of the trade-offs is initiated from two perspectives: Bayesian posterior inference error and frequentist uncertainty quantification error. From the Bayesian posterior inference perspective, the error of the variational posterior relative to the exact posterior is characterized. It is proven that, given a fixed computation budget, a lower-rank inferential model produces variational posteriors with a higher statistical approximation error but a lower computational error; the lower rank reduces variance in stochastic optimization and, in turn, accelerates convergence. From the frequentist uncertainty quantification perspective, the precision matrix of the variational posterior is considered as an uncertainty estimate, which involves an additional statistical error originating from the sampling uncertainty of the data.
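The inferential models in the abstract are Gaussians whose precision matrix is diagonal plus low-rank. A minimal sketch of such a family is below, assuming a zero-mean Gaussian with precision Lambda = diag(d) + U U^T, where U has k columns; the function names and the specific sampling route (Cholesky of the precision) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, rank = 10, 2  # dimension of the latent variable and rank k of U

def make_precision(d, U):
    """Precision matrix Lambda = diag(d) + U U^T.

    A positive diagonal d guarantees positive definiteness,
    since U U^T is positive semidefinite."""
    return np.diag(d) + U @ U.T

def sample_from_precision(Lam, n_samples, rng):
    """Draw samples from N(0, Lam^{-1}) via a Cholesky factor of Lam."""
    L = np.linalg.cholesky(Lam)  # Lam = L L^T
    eps = rng.standard_normal((Lam.shape[0], n_samples))
    # If x solves L^T x = eps, then Cov(x) = (L L^T)^{-1} = Lam^{-1}.
    return np.linalg.solve(L.T, eps).T

d = np.ones(dim)                           # positive diagonal part
U = 0.5 * rng.standard_normal((dim, rank)) # rank-k factor
Lam = make_precision(d, U)

# Parameter count grows linearly in the rank: dim * (1 + rank),
# versus O(dim^2) for a full precision matrix.
n_params = dim + dim * rank

samples = sample_from_precision(Lam, 5000, rng)
emp_cov = np.cov(samples.T)  # approximates Lam^{-1} up to Monte Carlo error
```

Varying `rank` traces out the model-selection axis studied in the abstract: a smaller k means fewer parameters per stochastic-gradient step (lower computational error) but a coarser approximating family (higher statistical approximation error).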