A1350
Title: On MCMC mixing for predictive inference under unidentified transformation models
Authors: Catherine Liu - The Hong Kong Polytechnic University (Hong Kong) [presenting]
Abstract: Reliable Bayesian predictive inference has long been an open problem under unidentified transformation models, since Markov chain Monte Carlo (MCMC) chains of posterior predictive distribution (PPD) values generally mix poorly. The aim is to address the poor mixing of PPD value chains under unidentified transformation models through an adaptive scheme for prior adjustment. Specifically, a notion of sufficient informativeness is introduced, which explicitly quantifies the information level provided by nonparametric priors and assesses MCMC mixing by comparison with the within-chain MCMC variance. The prior information level is formulated through a set of hyperparameters induced by the nonparametric prior elicitation and admits an analytic expression, guaranteed by asymptotic theory for the posterior variance under unidentified transformation models. This analytic prior information level in turn drives a hyperparameter tuning procedure that achieves MCMC mixing. The proposed method is general enough to cover various data domains through a multiplicative-error working model. Comprehensive simulations and real-world data analyses demonstrate that the method achieves MCMC mixing and outperforms state-of-the-art competitors in predictive capability.