B0955
Title: On lower bounds for the bias-variance trade-off
Authors: Alexis Derumigny - Delft University of Technology (Netherlands) [presenting]
Johannes Schmidt-Hieber - University of Twente (Netherlands)
Abstract: It is a common phenomenon that, for high-dimensional and nonparametric statistical models, rate-optimal estimators balance squared bias and variance. Although this balancing is widely observed, little is known about whether methods exist that could avoid the trade-off between bias and variance. A general strategy is proposed for obtaining lower bounds on the variance of any estimator with bias smaller than a prespecified bound. This shows to what extent the bias-variance trade-off is unavoidable and allows one to quantify the loss of performance for methods that do not obey it. The approach is based on a number of abstract lower bounds for the variance that involve the change of expectation under different probability measures, as well as information measures such as the Kullback-Leibler or $\chi^2$-divergence. Some of these inequalities rely on a new concept of information matrices. In the second part, the abstract lower bounds are applied to several statistical models, including the Gaussian white noise model, a boundary estimation problem, the Gaussian sequence model and the high-dimensional linear regression model. For these specific statistical applications, different types of bias-variance trade-offs occur, and they vary considerably in strength. For the trade-off between integrated squared bias and integrated variance in the Gaussian white noise model, a combination of the general lower-bound strategy with a reduction technique is proposed.
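For orientation, a standard change-of-measure inequality of the kind described above can be sketched as follows (the notation $P$, $Q$, $T$ is generic, and the exact inequalities used in the talk may differ): for probability measures $P$ and $Q$ with $Q \ll P$, and any estimator $T$ with finite second moment under $P$,
\[
  \big|\mathbb{E}_Q[T] - \mathbb{E}_P[T]\big|
  = \Big|\operatorname{Cov}_P\!\Big(T, \tfrac{dQ}{dP}\Big)\Big|
  \le \sqrt{\operatorname{Var}_P(T)\,\chi^2(Q\,\|\,P)},
\]
by the Cauchy-Schwarz inequality, since $\mathbb{E}_P[dQ/dP] = 1$ and $\operatorname{Var}_P(dQ/dP) = \chi^2(Q\,\|\,P) = \mathbb{E}_P[(dQ/dP - 1)^2]$. Rearranging gives a variance lower bound,
\[
  \operatorname{Var}_P(T) \ge
  \frac{\big(\mathbb{E}_Q[T] - \mathbb{E}_P[T]\big)^2}{\chi^2(Q\,\|\,P)}.
\]
Choosing $Q$ so that a small bias under both measures forces $\mathbb{E}_Q[T] - \mathbb{E}_P[T]$ to be large then yields a lower bound on the variance of every estimator whose bias stays below the prespecified level.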