B0464
Title: Unbiased estimators for linear regression and experimental design
Authors: Michal Derezinski - University of Michigan (United States) [presenting]
Abstract: Finding unbiased estimators for linear regression - where we wish to fit a linear function to a set of noisy measurements - is one of the oldest tasks in statistics. The classical Gauss-Markov theorem shows that the least squares estimator is the best linear unbiased estimator for this problem under a set of strong assumptions on the data and the measurement noise. Is it possible to construct an unbiased estimator for general random design linear regression without any assumptions on the measurement noise? We will show that it is, by applying the least squares estimator to the dataset augmented with a small sample of additional measurements generated from a certain determinantal point process called volume sampling. The resulting estimator is the first useful unbiased estimator for random design regression, and it can be constructed efficiently in many practical settings. As an example, we will show how this technique can be applied to A-optimal experimental design, where, given a large set of possible expensive measurements, we wish to select a small number of them to perform, so as to construct an unbiased estimator with small mean squared error. Finally, we will discuss how these results extend to regularized least squares and Bayesian experimental design.
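
To make the selection mechanism concrete, below is a minimal Python sketch of volume sampling in the finite-pool setting of experimental design: a subset S of candidate measurements is drawn with probability proportional to det(X_S^T X_S), and least squares is then fit on the selected responses. The brute-force sampler, the toy data, and the Monte Carlo check of unbiasedness are illustrative assumptions for this sketch; they are not the efficient construction or the random-design augmentation described in the talk.

# Minimal, illustrative sketch (not the talk's efficient algorithm):
# brute-force volume sampling from a finite pool of candidate measurements,
# followed by a least squares fit on the selected rows. Variable names
# (X, y, k) and the Monte Carlo check are assumptions made for exposition.
import itertools
import numpy as np

def volume_sample(X, k, rng):
    # Draw a size-k row subset S with probability proportional to det(X_S^T X_S),
    # enumerating all subsets (feasible only for a tiny pool).
    subsets = list(itertools.combinations(range(X.shape[0]), k))
    dets = np.array([np.linalg.det(X[list(S)].T @ X[list(S)]) for S in subsets])
    return list(subsets[rng.choice(len(subsets), p=dets / dets.sum())])

rng = np.random.default_rng(0)
n, d, k = 12, 2, 4                                      # tiny pool; select k >= d measurements
X = rng.standard_normal((n, d))                         # candidate (expensive) measurements
y = X @ np.array([1.0, -2.0]) + rng.standard_normal(n)  # noisy responses

w_full = np.linalg.lstsq(X, y, rcond=None)[0]           # least squares fit on all n responses

# Averaging the subsampled least squares fits over many volume samples should
# approach the full-data fit, reflecting the unbiasedness of the selection scheme.
fits = []
for _ in range(2000):
    S = volume_sample(X, k, rng)
    fits.append(np.linalg.lstsq(X[S], y[S], rcond=None)[0])
print("full-data least squares:    ", w_full)
print("average over volume samples:", np.mean(fits, axis=0))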