Title: Generalized variational inference
Authors: Jack Jewson - Universitat Pompeu Fabra and Barcelona Graduate School of Economics (Spain) [presenting]
Abstract: Bayesian inference can be viewed as an optimisation problem. This view is commonly associated with Variational Inference (VI), which many consider to be unprincipled and ad hoc. In fact, VI can be shown to produce the optimal posterior beliefs within the constrained family $Q$ according to the objective function specified by Bayes' rule. We use this observation to further generalise variational inference. We define optimal posterior beliefs using a triple: $Q$, the family of admissible distributions used to characterise posterior beliefs; $l(\theta, x)$, the loss function connecting the observed data to the parameter of interest for the analysis; and $D$, a divergence regularising the optimal posterior beliefs towards the prior. We demonstrate how changing $l(\theta, x)$ can lead to inference that is automatically robust to outliers, while changing $D$ affects the posterior uncertainty quantification, allowing us to improve the accuracy of estimates of marginal posterior uncertainty and to produce posteriors that are less sensitive to the prior specification. Formalising variational inference in this way allows us to improve transparency and performance over the myriad approximate inference methods that attempt to minimise different divergences between the approximate and exact posteriors.
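The optimisation problem defined by the triple can be sketched as follows (a hedged reconstruction from the generalised variational inference literature; the symbols $\pi$ for the prior and $x_{1:n}$ for the observed data are assumptions not stated in the abstract):

```latex
% Optimal posterior beliefs defined by the triple (Q, l, D):
% minimise the expected loss over the data, regularised towards the prior pi.
q^{*}(\theta) = \operatorname*{arg\,min}_{q \in Q}
\left\{ \mathbb{E}_{q(\theta)}\!\left[ \sum_{i=1}^{n} l(\theta, x_i) \right]
+ D\big(q \,\|\, \pi\big) \right\}
```

Standard VI is recovered as the special case where $l(\theta, x_i)$ is the negative log likelihood and $D$ is the Kullback-Leibler divergence; varying $l$ or $D$ away from these choices yields the robustness and uncertainty-quantification effects described above.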