B0613
Title: Approximate Laplace approximation
Authors: David Rossell - Universitat Pompeu Fabra (Spain) [presenting]
Anirban Bhattacharya - Texas A&M University (United States)
Abstract: Bayesian and $L_0$ model selection strategies require conducting an integration or maximization exercise for each candidate model in order to assign model posterior probabilities/scores. When the number of models is large and there is no closed-form expression for the integral/maximized likelihood, such computations can become cumbersome. We present a simple yet powerful idea based on the Laplace approximation (LA) to an integral. LA uses a quadratic Taylor expansion around the mode of the integrand, which typically has good accuracy but requires optimization. We propose the approximate Laplace approximation (ALA), which uses a Taylor expansion around the null parameter value. ALA brings significant speed-ups by avoiding optimization altogether and by sharing sufficient statistics across models. We prove that ALA provides an approximate inference method equipped with strong model selection properties in the family of non-linear GLMs, attaining rates comparable to exact computation. We also show that when the model is misspecified, the ALA rates can actually be faster than for exact computation, depending on the type of misspecification. We illustrate with examples in linear, logistic and survival regression with non-local priors.
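As a rough illustration (not taken from the submission): expanding the log-integrand $h(\beta)$ around the null value $\beta=0$ rather than the mode gives the ALA $\log \int e^{h(\beta)} d\beta \approx h(0) + \tfrac{1}{2} g_0^\top H_0^{-1} g_0 + \tfrac{p}{2}\log(2\pi) - \tfrac{1}{2}\log|H_0|$, where $g_0$ and $H_0$ are the gradient and negative Hessian of $h$ at $0$. The Python sketch below applies this to logistic regression with a hypothetical $N(0, g I)$ Gaussian prior (chosen only to keep the example simple; the paper uses non-local priors, and the function name and prior are assumptions). At $\beta=0$ the fitted probabilities equal $1/2$, so $g_0$ and $H_0$ reduce to quantities built from $X^\top (y-1/2)$ and $X^\top X/4$, which can be reused across candidate models.

    import numpy as np

    def ala_log_marginal_logistic(X, y, g=1.0):
        # Hypothetical sketch of the ALA idea for a logistic regression model
        # with a N(0, g*I) prior: quadratic expansion of the log-integrand at
        # beta = 0, so no optimization is needed.
        n, p = X.shape
        loglik0 = -n * np.log(2.0)             # logistic log-likelihood at beta = 0
        grad = X.T @ (y - 0.5)                 # score at beta = 0
        H = X.T @ X / 4.0 + np.eye(p) / g      # neg. Hessian at 0 plus prior precision
        sign, logdet = np.linalg.slogdet(H)
        quad = 0.5 * grad @ np.linalg.solve(H, grad)
        # Gaussian integral of the quadratic expansion around 0
        return loglik0 + quad - 0.5 * p * np.log(g) - 0.5 * logdet

Because grad and H for any submodel are sub-vectors/sub-matrices of the full $X^\top(y-1/2)$ and $X^\top X$, these sufficient statistics can be computed once and shared across all candidate models, which is the source of the speed-up described above.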