A0155
Title: Fast and accurate variational inference for models with many latent variables
Authors: Michael Smith - University of Melbourne (Australia) [presenting]
Ruben Loaiza-Maya - Monash University (Australia)
David Nott - National University of Singapore (Singapore)
Peter Danaher - Monash University (Australia)
Abstract: Models with a large number of latent variables are often used to exploit the information in big or complex data, but they can be difficult to estimate. Variational inference methods provide an attractive solution. These methods approximate the posterior density, yet for large latent variable models the existing choices of approximation can be inaccurate or slow to calibrate. We propose a family of tractable variational approximations that are more accurate and faster to calibrate in this case. The family combines a parsimonious approximation for the parameter posterior with the exact conditional posterior of the latent variables. We derive a simplified expression for the re-parameterization gradient of the variational lower bound, which is the main ingredient of the optimization algorithms used for calibration. The implementation only requires exact or approximate generation from the conditional posterior of the latent variables, rather than the computation of their density. In effect, our method provides a new way to employ Markov chain Monte Carlo (MCMC) within variational inference.
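
A minimal sketch of how such an approximation and its re-parameterization gradient might look; the notation (data $y$, latent variables $z$, parameters $\theta$, variational parameters $\lambda$) is introduced here for illustration and is not taken from the abstract itself.

\[
q_\lambda(\theta, z) \;=\; q^0_\lambda(\theta)\, p(z \mid \theta, y),
\]
where $q^0_\lambda(\theta)$ is the parsimonious approximation to the parameter posterior and $p(z \mid \theta, y)$ is the exact conditional posterior of the latent variables. Because $p(y, z, \theta) = p(z \mid \theta, y)\, p(y, \theta)$, the conditional density cancels in the variational lower bound,
\[
\mathcal{L}(\lambda)
= \mathbb{E}_{q_\lambda}\!\left[\log p(y, z, \theta) - \log q_\lambda(\theta, z)\right]
= \mathbb{E}_{q^0_\lambda}\!\left[\log p(y, \theta) - \log q^0_\lambda(\theta)\right].
\]
With the re-parameterization $\theta = t_\lambda(\varepsilon)$, $\varepsilon \sim f_\varepsilon$, and Fisher's identity $\nabla_\theta \log p(y, \theta) = \mathbb{E}_{p(z \mid \theta, y)}\!\left[\nabla_\theta \log p(y, z, \theta)\right]$, an unbiased gradient estimate needs only a draw $z \sim p(z \mid \theta, y)$ (exact, or approximate via a few MCMC steps), never the value of that conditional density:
\[
\nabla_\lambda \mathcal{L}(\lambda)
= \mathbb{E}_{f_\varepsilon,\; z \sim p(z \mid \theta, y)}\!\left[
\frac{\partial \theta}{\partial \lambda}^{\!\top}
\left\{ \nabla_\theta \log p(y, z, \theta) - \nabla_\theta \log q^0_\lambda(\theta) \right\}
\right].
\]

This is intended only as a schematic reading of the abstract: the cancellation of $\log p(z \mid \theta, y)$ is what removes the latent-variable density from the calibration, and drawing $z$ from its conditional posterior (for example by MCMC) is what remains.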