B0294
Title: Computationally efficient penalized quantile regression
Authors: Ben Sherwood - University of Kansas (United States) [presenting]
Abstract: Quantile regression directly models a conditional quantile. Penalized quantile regression constrains the regression coefficients, similarly to penalized mean regression. Quantile regression with a lasso penalty can be reframed as a quantile regression problem with augmented data and can therefore be formulated as a linear programming problem. If a group lasso penalty is used instead, it becomes a second-order cone programming problem. These approaches become computationally burdensome for large values of $n$ or $p$. Using a Huber approximation to the quantile loss function allows the use of computationally efficient algorithms that require a differentiable loss function, and these algorithms can be implemented for both penalties. They can then serve as the backbone for implementing penalized quantile regression with other penalties, such as the adaptive lasso, SCAD, MCP, and group versions of these penalties.
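To make the smoothing concrete, the following is a minimal Python sketch of one common Huber approximation to the check loss, obtained by writing $\rho_\tau(u) = |u|/2 + (\tau - 1/2)u$ and replacing $|u|$ with the Huber function. The function names, the bandwidth $\gamma$, and the single proximal-gradient step in the usage example are illustrative assumptions, not the authors' implementation; the talk's exact approximation and algorithm may differ.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u) = u * (tau - I(u < 0))."""
    return u * (tau - (u < 0))

def huber(u, gamma):
    """Huber function: quadratic near zero, linear in the tails."""
    absu = np.abs(u)
    return np.where(absu <= gamma, u**2 / (2 * gamma), absu - gamma / 2)

def huber_quantile_loss(u, tau, gamma):
    """Huber-smoothed check loss: replace |u| in |u|/2 + (tau - 1/2) u
    with the Huber function, giving a differentiable approximation."""
    return huber(u, gamma) / 2 + (tau - 0.5) * u

def huber_quantile_grad(u, tau, gamma):
    """Gradient of the smoothed loss; well defined everywhere."""
    dhuber = np.where(np.abs(u) <= gamma, u / gamma, np.sign(u))
    return dhuber / 2 + (tau - 0.5)

# Illustrative usage: one proximal-gradient (ISTA) step for a
# lasso-penalized, Huber-smoothed quantile objective
#   (1/n) sum_i loss(y_i - x_i' beta) + lam * ||beta||_1.
rng = np.random.default_rng(0)
n, p, tau, gamma, lam, step = 200, 50, 0.5, 0.1, 0.1, 0.01
X = rng.standard_normal((n, p))
y = X[:, 0] - X[:, 1] + rng.standard_normal(n)

beta = np.zeros(p)
r = y - X @ beta                                   # residuals
grad = -X.T @ huber_quantile_grad(r, tau, gamma) / n
beta = beta - step * grad                          # gradient step on the smooth part
beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)  # soft-threshold (lasso prox)
```

Because the smoothed loss has a Lipschitz-continuous gradient, standard first-order or coordinate-descent machinery for penalized smooth losses applies directly, which is what makes the group lasso and nonconvex (SCAD, MCP) extensions computationally tractable.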