A0270
Title: On regression with convex classes
Authors: Sara van de Geer - ETH Zurich (Switzerland) [presenting]
Abstract: Least squares estimation is considered over a convex class of regression functions that can be well approximated by linear functions. We assume that the dimension $M(\epsilon)$ needed for a linear $\epsilon$-approximation of the class grows polynomially in $1/\epsilon$, with exponent $W>0$. In that case, the rate of convergence of the least squares estimator is, up to logarithmic terms, of order $n^{-\frac{2+W}{2(1+W)}}$, where $n$ is the number of observations. The result is applied to the case where the class of regression functions is the convex hull of $d$-fold products of functions, for example the class of all $d$-dimensional distribution functions. For design on a grid, the exponent $W$ does not depend on $d$. We connect the results to entropy estimates and show that they can be sharp. The results can also be applied to density estimation problems. When the class of densities consists of mixtures of $d$-fold products of densities from a parametric class, the entropy of the class depends on $d$ only through the logarithmic terms.
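For illustration, evaluating the stated exponent at a few values of $W$ (a direct computation from the formula above, not an additional result) shows how the rate interpolates between the parametric regime and the classical convex-hull regime:
$$
n^{-\frac{2+W}{2(1+W)}} \;=\;
\begin{cases}
n^{-1} & \text{as } W \to 0,\\[2pt]
n^{-3/4} & \text{for } W = 1,\\[2pt]
n^{-1/2} & \text{as } W \to \infty.
\end{cases}
$$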