CFE-CMStatistics 2019
B0436
Title: Pitman closeness domination in predictive density estimation for two ordered normal means under alpha-divergence loss
Authors: Genso-Y.-T. Watanabe-Chang - Mejiro University (Japan) [presenting]
Nobuo Shinozaki - Keio University (Japan)
William Strawderman - Rutgers University (United States)
Abstract: Pitman closeness domination in predictive density estimation problems is considered when the underlying loss metric is the alpha-divergence, D(alpha). The underlying distributions are normal location-scale models: the distribution of the observables, the distribution of the variable whose density is to be predicted, and the estimated predictive density, which is taken to be of the plug-in type. The scales may be known or unknown. A general expression for the alpha-divergence loss in this setting has been derived previously; it is a concave monotone function of the quadratic loss and also a function of the two variances (of the predictand and of the plug-in). We demonstrate D(alpha)-Pitman closeness domination of certain plug-in predictive densities over others, simultaneously for the entire class of metrics, whenever modified Pitman closeness domination holds in the related problem of estimating the mean. We also establish D(alpha)-Pitman closeness results for certain generalized Bayesian (best invariant) predictive density estimators. The examples of D(alpha)-Pitman closeness domination presented relate to estimating the predictive density of the variable with the larger mean. We also consider the case of two ordered normal means with a known covariance matrix.
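As a concrete illustration of the structural fact the abstract relies on (a sketch, not part of the submission): under one common convention for the alpha-divergence, D_alpha(p, q) = (1 - ∫ p^alpha q^(1-alpha) dx) / (alpha (1 - alpha)) for alpha in (0, 1), the divergence between two normal densities with a common known scale reduces to a concave, increasing function of the squared mean difference, i.e. of the quadratic loss. The function names, the integration grid, and the choice of convention below are illustrative assumptions; conventions for D(alpha) differ across the literature.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def alpha_divergence_numeric(mu_p, mu_q, sigma, alpha, lo=-20.0, hi=20.0, n=4001):
    """Trapezoid-rule approximation of D_alpha(p, q) for two normals with
    common scale sigma, under the (assumed) convention
    D_alpha = (1 - integral p^alpha q^(1-alpha) dx) / (alpha * (1 - alpha))."""
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        val = normal_pdf(x, mu_p, sigma) ** alpha * normal_pdf(x, mu_q, sigma) ** (1 - alpha)
        total += 0.5 * val if i in (0, n - 1) else val
    return (1.0 - total * h) / (alpha * (1.0 - alpha))

def alpha_divergence_closed_form(delta, sigma, alpha):
    """Closed form for equal variances: the integral of p^a q^(1-a) equals
    exp(-a (1-a) delta^2 / (2 sigma^2)) with delta = mu_p - mu_q, so D_alpha
    is a concave, increasing function of delta^2 (the quadratic loss)."""
    a = alpha
    return (1.0 - math.exp(-a * (1.0 - a) * delta ** 2 / (2.0 * sigma ** 2))) / (a * (1.0 - a))
```

Because each D(alpha) in the family is a monotone transform of the same quadratic loss, a closeness comparison that holds for the quadratic loss transfers simultaneously to every member of the family, which is the mechanism behind the "entire class of metrics" claim in the abstract.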