B1721
Title: Variational inference and robustness
Authors: Cynthia Rush - Columbia University (United States) [presenting]
Abstract: Variational inference (VI) is a machine learning technique that approximates difficult-to-compute probability densities through optimization. While VI has been used in numerous applications, it is particularly useful in Bayesian statistics, where one wishes to perform statistical inference about unknown parameters via computations on a posterior density. We will discuss some new ideas about VI and robustness to model misspecification. In particular, we will study alpha-posteriors, which distort standard posterior inference by downweighting the likelihood, and their variational approximations. We will see that such distortions, if tuned appropriately, can outperform standard posterior inference when there is potential parametric model misspecification.
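For reference, a minimal sketch of the objects discussed in the abstract, using the standard notation for alpha-posteriors rather than anything taken from the talk itself: given data x, a likelihood p(x | theta), a prior pi(theta), and a temperature alpha in (0, 1], the alpha-posterior downweights the likelihood,
\[
\pi_{\alpha}(\theta \mid x) \;\propto\; p(x \mid \theta)^{\alpha}\, \pi(\theta),
\]
and its variational approximation is typically defined by minimizing a Kullback-Leibler divergence over a tractable family $\mathcal{Q}$, equivalently maximizing a tempered evidence lower bound:
\[
q^{*}_{\alpha} \;=\; \arg\min_{q \in \mathcal{Q}} \mathrm{KL}\!\big(q \,\big\|\, \pi_{\alpha}(\cdot \mid x)\big)
\;=\; \arg\max_{q \in \mathcal{Q}} \Big\{ \alpha\, \mathbb{E}_{q}\big[\log p(x \mid \theta)\big] - \mathrm{KL}\big(q \,\|\, \pi\big) \Big\}.
\]
Setting alpha = 1 recovers the standard posterior and its usual variational approximation; how to tune alpha under potential parametric misspecification is the subject of the talk.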