CMStatistics 2023
B1042
Title: Strong inductive biases provably prevent harmless interpolation
Authors: Konstantin Donhauser - ETH Zurich (Switzerland) [presenting]
Abstract: Classical wisdom suggests that estimators should avoid fitting noise to achieve good generalization. In contrast, modern overparameterized models can yield small test errors despite interpolating noise, a phenomenon often called benign overfitting or harmless interpolation. The degree to which interpolation is harmless is argued to hinge upon the strength of an estimator's inductive bias, i.e., how heavily the estimator favors solutions with a certain structure. The main insight is that while strong inductive biases prevent harmless interpolation, weak inductive biases can even require fitting noise to generalize well. The claim is supported by theoretical results for minimum-norm/maximum-margin interpolators and empirical results for simple neural networks.
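The minimum-norm interpolators mentioned above can be illustrated with a small sketch (not from the submission; all names and parameters here are hypothetical choices for illustration). In an overparameterized linear regression with more features than samples, NumPy's `np.linalg.lstsq` returns the minimum-l2-norm solution, which fits the noisy training labels exactly:

```python
import numpy as np

# Hypothetical illustration of a minimum-norm interpolator in an
# overparameterized linear model (d >> n). lstsq on an underdetermined
# system returns the minimum-l2-norm solution, which interpolates the
# noisy training labels exactly.
rng = np.random.default_rng(0)
n, d = 20, 500                                   # n samples, d features
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[0] = 1.0                                  # sparse ground truth
y = X @ w_true + 0.1 * rng.standard_normal(n)    # noisy labels

w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)    # min-norm interpolator

train_err = np.max(np.abs(X @ w_hat - y))        # ~0: fits the noise too
print(f"max train residual: {train_err:.2e}")
```

Whether such an interpolator also achieves a small test error, i.e., whether fitting the noise is harmless, is exactly what the talk argues depends on the strength of the inductive bias (here, the implicit l2-norm minimization).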