B1178
Title: Decoupling shrinkage and selection in Gaussian linear factor analysis
Authors: Hedibert Lopes - INSPER (Brazil) [presenting]
Henrique Bolfarine - University of Sao Paulo (Brazil)
Carlos Carvalho - The University of Texas at Austin (United States)
Jared Murray - The University of Texas at Austin (United States)
Abstract: Sparsity-inducing priors have become a relevant option for variable selection in a variety of statistical models. Despite their interpretability and ease of application, there is still disagreement over how to determine whether a parameter is truly zero a posteriori. In this context, the decoupling shrinkage and selection (DSS) approach emerges as an alternative that preserves the posterior information while providing an optimal selection of variables. We extend the DSS methodology to the Gaussian linear factor analysis model in order to obtain a sparse loadings matrix, setting to zero the parameters that are not relevant to the model. To perform this selection, we introduce a penalized loss function, a post-inference procedure that relies on a penalized predictive version of the expectation-maximization (EM) algorithm, and a graphical summary. The findings are illustrated with simulations and two applications, the first in psychometrics and the second in denoising handwritten data. Standard normal and point-mass priors were used, resulting in significantly different levels of sparsity in the recovered loadings matrix.
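As a rough sketch of the DSS idea in this factor-analytic setting (in our own notation, with a generic squared-error predictive loss and an L1 penalty, not necessarily the authors' exact formulation), the sparse loadings summary can be viewed as the minimizer of an expected predictive loss plus a sparsity penalty:

\[
\hat{\Lambda}_{\lambda}
= \operatorname*{arg\,min}_{\Lambda}\;
\mathbb{E}\!\left[\, \lVert \tilde{Y} - \tilde{F}\Lambda^{\top} \rVert_F^2 \,\middle|\, Y \right]
+ \lambda \sum_{i,j} \lvert \Lambda_{ij} \rvert ,
\]

where the expectation is taken over the posterior predictive distribution of future data \(\tilde{Y}\) and factor scores \(\tilde{F}\), and the tuning parameter \(\lambda\) traces a path of increasingly sparse loadings summaries from which a final summary could be chosen, for instance via a graphical summary of the kind mentioned in the abstract.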