EcoSta 2018 Submission A0154
Title: On choosing mixture components via non-local priors
Authors: Mark Steel - University of Warwick (United Kingdom) [presenting]
David Rossell - Universitat Pompeu Fabra (Spain)
Jairo Fuquene - UC Davis (United States)
Abstract: Choosing the number of mixture components remains a central but elusive challenge. Traditional model selection criteria can be either overly liberal or overly conservative in enforcing parsimony. They may also result in poorly separated components of limited practical use. Non-local priors (NLPs) are a family of distributions that encourage parsimony by enforcing a separation between the models under consideration. We formalize NLPs in the context of mixtures and show how they lead to well-separated components that are interpretable as distinct subpopulations. We suggest default prior settings, give a theoretical characterization of the sparsity induced by NLPs, derive tractable expressions and propose simple algorithms to obtain the integrated likelihood and parameter estimates. The framework is generic, and we fully develop multivariate Normal, Binomial and product Binomial mixtures based on a family of exchangeable moment priors. Our results show a serious lack of sensitivity of the Bayesian information criterion (BIC) and insufficient parsimony of the AIC and a local prior counterpart to our formulation. The singular BIC behaved like the BIC in some examples and like the AIC in others. We also offer comparisons to overfitted and repulsive overfitted mixtures, the performance of which depended on the choice of prior parameters. The number of components inferred under NLPs was closer to the true number (when known) and remained robust to prior settings.
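
As an informal illustration of the separation idea, the sketch below evaluates a moment-type penalty of the form prod_{i<j} ||mu_i - mu_j||^2 / g on a set of component means; the penalty vanishes whenever two means coincide, so a prior proportional to it assigns negligible density to overlapping components. This is only one generic instance of an exchangeable moment penalty under assumed Euclidean weighting, not the authors' default prior, whose scaling and matrix weighting may differ; the function name mom_separation_penalty is hypothetical.

import numpy as np
from itertools import combinations

def mom_separation_penalty(means, g=1.0):
    """Product over component pairs of squared Euclidean distances between
    means, scaled by g. The product is zero when any two means coincide,
    so poorly separated configurations receive vanishing prior weight."""
    pen = 1.0
    for i, j in combinations(range(len(means)), 2):
        diff = np.asarray(means[i], dtype=float) - np.asarray(means[j], dtype=float)
        pen *= float(diff @ diff) / g
    return pen

# Well-separated means give a large penalty factor (high prior density) ...
print(mom_separation_penalty([[-2.0, 0.0], [2.0, 0.0], [0.0, 3.0]]))
# ... while nearly coincident means drive the factor towards zero.
print(mom_separation_penalty([[0.0, 0.0], [0.05, 0.0], [0.0, 3.0]]))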