CFE 2019 - CMStatistics: View Submission
Title: The predictive power of kernel principal components regression
Authors: Ben Jones - Cardiff University (United Kingdom) [presenting]
Andreas Artemiou - Cardiff University (United Kingdom)
Bing Li - The Pennsylvania State University (United States)
Abstract: A well-known empirical phenomenon in statistics is that the higher-ranking principal components of a predictor variable tend to have greater squared correlations with a response variable than the lower-ranking ones, even though the extraction procedure is unsupervised. This was originally observed in the classical setting, where it is assumed that a linear model relates the response to the predictor. In this setting, theoretical analyses have proven that, under a uniformity assumption on the regression coefficients or on the covariance matrix of the predictor, this tendency holds at the population level. Further studies have established this result in more general settings, including single-index and conditional independence models. The principal components procedure can be extended, using the kernel trick, to nonlinear directions in the data. The predictive tendency, using the nonlinear components, has also been empirically recognised in this setting. Recent research is detailed which establishes this tendency, in the nonlinear setting, at the population level. The first framework is that of nonparametric regression. The second, much more general, framework is one in which the response conditional on the predictor has an arbitrary or random distribution.
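The classical phenomenon described above can be reproduced in a small simulation. The sketch below, which is an illustration and not the authors' analysis, draws a predictor with a decreasing eigenvalue spectrum and regression coefficients at random (mirroring the uniformity assumption on the coefficients), then averages, over many replications, the squared sample correlation between each principal-component score and the response of a noisy linear model. All variable names and the specific spectrum are hypothetical choices for the demonstration.

```python
import numpy as np

def pc_squared_correlations(X, y):
    """Squared sample correlations between each principal-component
    score of X and the response y, ordered by decreasing eigenvalue."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # largest eigenvalue first
    scores = Xc @ eigvecs[:, order]            # PC scores, ranks 1..p
    yc = y - y.mean()
    r2 = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        s = scores[:, j]
        r2[j] = (s @ yc) ** 2 / ((s @ s) * (yc @ yc))
    return r2

rng = np.random.default_rng(0)
p, n, reps = 5, 400, 200
eigenvalues = np.array([4.0, 2.0, 1.0, 0.5, 0.25])      # decreasing spectrum
avg_r2 = np.zeros(p)
for _ in range(reps):
    X = rng.standard_normal((n, p)) * np.sqrt(eigenvalues)  # diagonal covariance
    beta = rng.standard_normal(p)               # random regression coefficients
    y = X @ beta + rng.standard_normal(n)       # linear model with noise
    avg_r2 += pc_squared_correlations(X, y)
avg_r2 /= reps
print(np.round(avg_r2, 3))
```

On average, the squared correlations decrease with component rank, even though the components were extracted without reference to the response; any single replication need not be monotone, which is why the simulation averages over draws of the coefficients.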