CMStatistics 2022
B1131
Title: Nonparametric empirical Bayes prediction in mixed models
Authors: Trambak Banerjee - University of Kansas (United States) [presenting]
Padma Sharma - Federal Reserve Bank of Kansas City (United States)
Abstract: Mixed models are classical tools in statistics for modeling repeated data on subjects, such as data on patients, customers, or firms collected over time. These models extend conventional linear models to include latent parameters, called random effects, that capture between-subject variation and accommodate dependence within the repeated measurements of a subject. Traditionally, predictions in mixed models are conducted by assuming that the random effects follow a zero-mean Normal distribution, which leads to the Best Linear Unbiased Predictor (BLUP) of the random effects in these models. However, such a distributional assumption on the random effects is restrictive and may lead to inefficient predictions, especially when the true random effect distribution is far from Normal. Here, we discuss a novel framework, EBPred, for empirical Bayes prediction in mixed models. The predictions from EBPred rely on the Best Predictor (BP) of the random effects, which is constructed without any parametric assumption on the distribution of the random effects. We develop theory to show that the corresponding predictions from EBPred are asymptotically optimal in terms of mean squared prediction error. Extensive simulation studies demonstrate that EBPred outperforms existing predictive rules in mixed models, and that the efficiency gains are substantial in many settings.
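To make the contrast between the BLUP and the nonparametric Best Predictor concrete, the following is a minimal sketch assuming a one-way random-intercept model with known variance components; the model class and notation here are illustrative and are not taken from the submission.

% Illustrative one-way random-intercept model (assumed for this sketch):
\begin{align*}
  y_{ij} &= x_{ij}^\top \beta + \theta_i + \epsilon_{ij},
  \qquad \epsilon_{ij} \sim N(0,\sigma^2), \quad \theta_i \sim G, \\
  &\quad i = 1,\dots,m \ \text{subjects}, \qquad j = 1,\dots,n_i \ \text{repeated measurements}.
\end{align*}
% Best Predictor of the random effect: no parametric form is imposed on G.
\begin{equation*}
  \tilde{\theta}_i = \mathbb{E}_G\!\left[\theta_i \mid y_{i1},\dots,y_{in_i}\right].
\end{equation*}
% Under the traditional assumption G = N(0, \sigma_\theta^2), the Best Predictor
% reduces to the BLUP, a linear shrinkage of the subject-level residual mean
% toward zero:
\begin{equation*}
  \hat{\theta}_i^{\mathrm{BLUP}}
  = \frac{\sigma_\theta^2}{\sigma_\theta^2 + \sigma^2/n_i}
    \left(\bar{y}_i - \bar{x}_i^\top \hat{\beta}\right).
\end{equation*}

When G is far from Normal, the linear shrinkage above can differ substantially from the conditional mean under the true G, which is the source of the inefficiency that EBPred is designed to avoid.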