EcoSta 2022
A0773
Title: Condition of GIC to select the model minimizing KL-loss function in high-dimensional multivariate linear regression
Authors: Ryoya Oda - Hiroshima University (Japan) [presenting]
Hirokazu Yanagihara - Hiroshima University (Japan)
Abstract: The focus is on variable selection based on minimizing the generalized information criterion (GIC) for choosing explanatory variables in a normality-assumed multivariate linear regression. The GIC is defined as the sum of -2 times the maximum log-likelihood and a penalty term involving a positive parameter. From the viewpoint of prediction ability, it is often desirable that the model minimizing a loss function among all candidate models be chosen. Hence, it is important to examine whether the GIC has the asymptotic property that the probability of selecting the model minimizing the Kullback-Leibler (KL) loss function converges to 1; we call this property consistency. Recently, statistical methods for high-dimensional data have attracted significant attention in the literature. We obtain conditions for the consistency of the GIC, in the sense of minimizing the KL-loss function, under the following high-dimensional asymptotic framework: the sample size tends to infinity, and the dimension of the response variables divided by the sample size converges to a constant in [0, 1). Then, using the obtained conditions, we propose a consistent variable selection criterion under this high-dimensional asymptotic framework. Simulation experiments show that the probability that our proposed criterion selects the model minimizing the KL-loss function is high even when the dimension is large.
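As a companion to the abstract, the following is a minimal sketch of GIC-based subset selection for a normal multivariate linear regression. It is illustrative only, not the authors' exact criterion: the exhaustive subset search, the parameter count, and the penalty parameter `alpha` are assumptions here (with `alpha = 2` recovering AIC and `alpha = log n` recovering BIC as the usual special cases of the GIC family).

```python
import numpy as np
from itertools import combinations

def gic(Y, X, subset, alpha):
    """GIC of the candidate model using the columns of X listed in `subset`.

    Computed as -2 * (maximum log-likelihood under normality) plus
    alpha times the number of free parameters. The parameter count
    (p coefficients per selected column plus the symmetric covariance)
    is an assumption of this sketch.
    """
    n, p = Y.shape
    Xj = X[:, subset]
    # MLE of the coefficient matrix, then the residual covariance MLE
    B, *_ = np.linalg.lstsq(Xj, Y, rcond=None)
    resid = Y - Xj @ B
    Sigma = resid.T @ resid / n
    _, logdet = np.linalg.slogdet(Sigma)
    neg2loglik = n * logdet + n * p * (1.0 + np.log(2.0 * np.pi))
    n_params = p * len(subset) + p * (p + 1) / 2
    return neg2loglik + alpha * n_params

def select_model(Y, X, alpha):
    """Exhaustively pick the predictor subset minimizing the GIC."""
    cols = range(X.shape[1])
    candidates = [list(s) for r in range(1, X.shape[1] + 1)
                  for s in combinations(cols, r)]
    return min(candidates, key=lambda s: gic(Y, X, s, alpha))
```

With a strong signal, a subset missing a relevant predictor inflates the residual covariance determinant by O(n) in the criterion, while an overfitted subset pays only the penalty term, so the minimizer tends toward the loss-minimizing model as the abstract describes.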