A1230
Title: Prior input sensitivities of posterior MCMC inference via infinitesimal perturbation analysis with application to VARs
Authors: Liana Jacobi - University of Melbourne (Australia) [presenting]
Dan Zhu - Monash University (Australia)
Abstract: A key feature of Bayesian inference is the dependence of posterior distributions on prior (input) parameters; posterior statistics are high-dimensional integrals that typically require numerical solutions. An efficient numerical approach is introduced for input sensitivity analysis of posterior inference obtained via popular Gibbs Markov chain Monte Carlo (MCMC) simulation methods. We extend infinitesimal perturbation analysis (IPA) of the simulation path, widely used in the classical simulation context to assess input sensitivities of stochastic dynamic systems, to computationally intensive MCMC dependent sampling. We show that IPA derivatives of posterior statistics from MCMC inference, built from the derivatives of the parameter draws, have the desirable asymptotic properties of unbiasedness and consistency. We further recommend the use of automatic differentiation to compute the required Jacobians efficiently. The approach therefore allows for a comprehensive and exact local sensitivity analysis of MCMC output with respect to all input parameters, without requiring analytical expressions (likelihood ratio methods, symbolic differentiation) or re-running of the algorithm (numerical differentiation). We illustrate the method by assessing convergence and prior robustness of inference on model parameters and impulse response functions in an application of Bayesian vector autoregression (VAR) analysis with shrinkage priors to US macroeconomic time-series data.
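As a minimal illustrative sketch (not the authors' implementation or the VAR application), the idea of differentiating the MCMC simulation path can be shown on a toy conjugate normal model, y_i ~ N(mu, sigma^2) with priors mu ~ N(mu0, tau0^2) and sigma^2 ~ InvGamma(a0, b0): holding the underlying random numbers fixed, every Gibbs draw is a smooth function of the prior hyperparameters, so automatic differentiation through the sampler yields the IPA Jacobian of a posterior estimate. All names below (gibbs_posterior_mean, the hyperparameter ordering, the chain length) are hypothetical choices for this sketch, written in JAX.

```python
import jax
import jax.numpy as jnp

def gibbs_posterior_mean(hyper, y, key, n_iter=2000, burn=500):
    """IPA sketch: Gibbs sampler for y_i ~ N(mu, s2) with conjugate priors
    mu ~ N(mu0, tau0sq), s2 ~ InvGamma(a0, b0). The random numbers are fixed
    by `key`, so every draw is a smooth function of the prior hyperparameters.
    Returns the MCMC estimate of E[mu | y]."""
    mu0, tau0sq, a0, b0 = hyper
    n, ybar = y.shape[0], jnp.mean(y)
    keys = jax.random.split(key, n_iter)  # common random numbers, independent of hyper

    def step(carry, k):
        mu, s2 = carry
        k1, k2 = jax.random.split(k)
        # mu | s2, y : normal full conditional, drawn via reparameterization
        v = 1.0 / (1.0 / tau0sq + n / s2)
        m = v * (mu0 / tau0sq + n * ybar / s2)
        mu = m + jnp.sqrt(v) * jax.random.normal(k1)
        # s2 | mu, y : inverse-gamma via a gamma draw (differentiable in its shape)
        shape = a0 + 0.5 * n
        scale = b0 + 0.5 * jnp.sum((y - mu) ** 2)
        s2 = scale / jax.random.gamma(k2, shape)
        return (mu, s2), mu

    (_, _), mu_draws = jax.lax.scan(step, (ybar, jnp.var(y)), keys)
    return jnp.mean(mu_draws[burn:])

# IPA Jacobian of the posterior-mean estimate w.r.t. all four prior
# hyperparameters, obtained by differentiating through the whole chain.
key = jax.random.PRNGKey(0)
y = 1.5 + 2.0 * jax.random.normal(jax.random.PRNGKey(1), (100,))
hyper = jnp.array([0.0, 10.0, 2.0, 2.0])   # (mu0, tau0sq, a0, b0)
sens = jax.jacfwd(gibbs_posterior_mean)(hyper, y, key)
print("d E[mu|y] / d (mu0, tau0sq, a0, b0):", sens)
```

Averaging the per-draw derivatives over the chain gives the IPA sensitivity estimator described in the abstract; no re-running of the sampler and no likelihood-ratio weights are needed, only one differentiated pass over the fixed simulation path.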