A0478
Title: Noise and overfitting: A new perspective on the predictive performance of a linear model
Authors: Insha Ullah - Australian National University (Australia) [presenting]
Alan Welsh - Australian National University (Australia)
Abstract: Traditionally, the bias-variance trade-off has guided model selection in under-parameterized regimes, under the belief that overparameterization leads to overfitting and poor generalization. However, recent studies have uncovered the double descent curve, where test error surprisingly decreases in overparameterized models, challenging this framework. The aim is to examine the counterintuitive benefits of overfitting in linear models, investigating how noise arising from the predictors or from the observations affects prediction accuracy. This question matters in practice, as irrelevant variables are common in applications yet often overlooked in discussions of the double descent curve and of regularization techniques such as ridge regularization. The findings explain the mechanics of the double descent curve and suggest that overfitting can enhance prediction accuracy under certain conditions. Recent research has shown that minimum norm least squares estimation performs shrinkage in the presence of irrelevant predictors and tends to outperform ridge regularization with a positive ridge penalty in terms of prediction accuracy. Empirical evidence also suggests that the optimal ridge penalty may be zero or even negative, challenging standard practice. The analysis demonstrates the advantages of a negative ridge penalty and highlights the role of noise in model performance.
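As a minimal illustrative sketch of the estimators discussed in the abstract (not the authors' code; all simulation settings, names, and sample sizes below are assumptions chosen purely for illustration), the following Python snippet compares the minimum-norm least squares interpolator with ridge regression in an overparameterized problem where most predictors are pure noise. Ridge is computed in its dual (kernel) form, which remains well defined at a zero penalty and even at a small negative penalty, provided the Gram matrix X Xᵀ + λI stays positive definite.

```python
# Hypothetical simulation sketch: minimum-norm least squares vs. ridge
# (including a small negative penalty) with many irrelevant predictors.
import numpy as np

rng = np.random.default_rng(0)
n, p_signal, p_noise = 50, 5, 200          # overparameterized: p = 205 > n
p = p_signal + p_noise
beta = np.zeros(p)
beta[:p_signal] = 1.0                      # only the first 5 predictors matter

def simulate(m):
    """Draw m observations from a linear model with unit noise variance."""
    X = rng.standard_normal((m, p))
    y = X @ beta + rng.standard_normal(m)
    return X, y

X, y = simulate(n)
X_test, y_test = simulate(2000)

def ridge_dual(X, y, lam):
    """Ridge via the dual form: beta_hat = X' (X X' + lam I)^{-1} y.

    For p > n this is valid at lam = 0 (recovering the minimum-norm
    least squares interpolator) and at small negative lam, as long as
    X X' + lam I remains positive definite.
    """
    G = X @ X.T
    return X.T @ np.linalg.solve(G + lam * np.eye(len(y)), y)

def test_mse(b):
    return np.mean((y_test - X_test @ b) ** 2)

b_min_norm = np.linalg.pinv(X) @ y         # minimum-norm interpolator
print("min-norm LS test MSE:", test_mse(b_min_norm))
for lam in [10.0, 1.0, 0.0, -1.0]:         # negative lam kept small so the
    b = ridge_dual(X, y, lam)              # Gram matrix stays positive definite
    print(f"ridge(lam={lam:+.1f}) test MSE:", test_mse(b))
```

The dual form is used because, when p > n, XᵀX + λI is singular at λ = 0, whereas X Xᵀ is almost surely positive definite for Gaussian designs; λ = 0 then reproduces the minimum-norm solution and a mildly negative λ remains well defined, mirroring the abstract's point that the optimal ridge penalty need not be positive.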