A0728
Title: A minibatch Gibbs sampler for scalable large-scale Bayesian inference on latent variable models
Authors: Dongrong Li - The Chinese University of Hong Kong (Hong Kong) [presenting]
Abstract: Efficient and scalable Markov chain Monte Carlo (MCMC) algorithms are essential for modern Bayesian computation, since evaluating the joint density on the full dataset in each iteration is computationally prohibitive in the big-data era. Minibatching has emerged as a strategy to tackle this problem, but existing methods either require non-trivial upper or lower bounds on the joint density or rely heavily on gradient evaluations. A novel minibatch Gibbs sampler for large-scale Bayesian latent variable models is proposed, which can efficiently sample from an approximate posterior density. The sampler uses a variable-splitting technique and introduces auxiliary variables to ensure efficient minibatching at each iteration. It is flexible, can accommodate any Metropolis proposal, and does not require non-trivial upper or lower bounds on the joint density. It is shown that the approximate density can be made arbitrarily close to the true posterior asymptotically, and explicit non-asymptotic error and mixing bounds are established to theoretically guarantee the convergence rates. The sampler's performance is evaluated on synthetic and real data, demonstrating its advantages over existing algorithms. The proposed minibatch Gibbs sampler offers a flexible, efficient, and scalable solution for large-scale Bayesian latent variable models and has the potential to advance modern Bayesian computation.
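The abstract does not spell out the algorithmic details, so the following Python sketch illustrates only the general idea of a minibatch Metropolis-within-Gibbs update on a toy conjugate latent-variable model. The model, batch size, and naive N/|B| likelihood rescaling are all assumptions for illustration, not the authors' construction; in particular, naive rescaling only targets an approximate posterior, which is the kind of error the proposed auxiliary-variable sampler is designed to control with explicit bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model (an assumption, not the authors' model):
#   theta ~ N(0, 1)        global parameter
#   z_i   ~ N(theta, 1)    per-observation latent variable
#   x_i   ~ N(z_i, 1)      observation
N = 10_000
true_theta = 2.0
x = rng.normal(rng.normal(true_theta, 1.0, N), 1.0)

theta = 0.0
z = np.zeros(N)
batch_size = 100

def log_prior(t):
    return -0.5 * t**2

for it in range(2000):
    # 1) Draw a minibatch of indices by uniform subsampling;
    #    latents outside the batch are left untouched this iteration.
    idx = rng.choice(N, size=batch_size, replace=False)

    # 2) Gibbs update of the minibatch latents; for this conjugate toy
    #    model, z_i | theta, x_i ~ N((theta + x_i)/2, 1/2).
    z[idx] = rng.normal((theta + x[idx]) / 2.0, np.sqrt(0.5))

    # 3) Random-walk Metropolis step for theta, scoring only the minibatch
    #    and rescaling by N / batch_size to estimate the full log-likelihood.
    #    This rescaling is the naive (biased) approximation mentioned above.
    prop = theta + rng.normal(0.0, 0.05)
    scale = N / batch_size
    loglik_cur = -0.5 * np.sum((z[idx] - theta) ** 2) * scale
    loglik_prop = -0.5 * np.sum((z[idx] - prop) ** 2) * scale
    log_alpha = (loglik_prop + log_prior(prop)) - (loglik_cur + log_prior(theta))
    if np.log(rng.uniform()) < log_alpha:
        theta = prop

print(f"estimated theta after sampling: {theta:.3f}")
```

Per iteration, the cost is O(batch_size) rather than O(N), which is the scalability gain minibatching offers; the open question such a naive scheme leaves, and which the proposed sampler addresses, is how far the resulting stationary distribution sits from the true posterior.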