CFE-CMStatistics 2025
A1122
Title: Understanding the effect of input space dimension on kernel learning rates
Authors: Enrico Bignozzi - USI (Switzerland) [presenting]
Abstract: In recent years, the statistical-learning literature has made significant strides in non-parametric learning. These advances, however, have often run in parallel with the econometric tradition, which typically accounts for temporal dependence and explicitly links input dimensionality to the rate of risk convergence. In many econometric settings, asymptotic convergence rates depend directly on the dimensionality of the regressors, whereas most machine-learning work introduces alternative complexity measures and assumes i.i.d. data. The aim is to close that gap by deriving convergence rates for kernel ridge regression that make the dependence on the number of input variables explicit and accommodate temporally dependent data. Smoothness conditions are imposed on the regression target, and the resulting rates are related to the eigenvalue decay of the kernel matrix, which is itself tied to the dimensionality of the inputs. This yields new asymptotic bounds that quantify the curse of dimensionality in non-i.i.d. settings.