CFE-CMStatistics 2024
A1433
Title: Adaptive ridge regression and fractional degrees of freedom
Authors: Keith Knight - University of Toronto (Canada) [presenting]
Abstract: In classical regression model selection methods (such as forward and backward selection, Mallows $C_p$, adjusted $R^2$, etc.), a predictor is either in the model or out of it, in the sense that its estimated parameter is assumed to be either a least squares estimate or 0, respectively. Shrinkage methods (such as ridge regression and the LASSO) can be viewed as relaxations of these classical model selection methods in the sense that they shrink smaller least squares estimates close to 0 or exactly to 0. An interesting property of ridge regression is that, for given values of the ridge parameters, a fractional contribution (between 0 and 1) of each predictor to the model can be defined. The aim is to define the ridge parameters in terms of specified fractional contributions for each predictor.
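As a rough illustration of the idea (not the authors' construction), consider the orthonormal-design case, where a natural candidate for a predictor's fractional contribution is its ridge shrinkage factor $1/(1+\lambda_j)$: choosing the per-predictor penalty $\lambda_j = (1-c_j)/c_j$ then yields a specified contribution $c_j \in (0,1)$, and the effective degrees of freedom equal $\sum_j c_j$, which can be fractional. The target fractions and the penalty formula below are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal design for illustration, so X'X = I.
n, p = 100, 3
X, _ = np.linalg.qr(rng.normal(size=(n, p)))
beta = np.array([2.0, -1.0, 0.5])
y = X @ beta + 0.1 * rng.normal(size=n)

# Hypothetical target fractional contributions c_j in (0, 1).
c = np.array([0.9, 0.5, 0.1])

# With X'X = I, ridge with diagonal penalty diag(lambda_j) shrinks each
# OLS coefficient by 1 / (1 + lambda_j); setting lambda_j = (1 - c_j) / c_j
# makes that shrinkage factor exactly c_j.
lam = (1 - c) / c
beta_ols = X.T @ y                       # OLS estimate, since X'X = I
A = X.T @ X + np.diag(lam)
beta_ridge = np.linalg.solve(A, X.T @ y)

shrinkage = beta_ridge / beta_ols        # per-predictor contributions, = c
df = np.trace(X @ np.linalg.solve(A, X.T))  # effective (fractional) df, = c.sum()
print(shrinkage, df)
```

In the non-orthogonal case the shrinkage is no longer coordinate-wise, which is presumably where defining ridge parameters from specified contributions becomes nontrivial.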