B1302
Title: Retire: Robustified expectile regression in high dimensions
Authors: Rebeka Man - University of Michigan (United States)
Kean Ming Tan - University of Michigan (United States) [presenting]
Zian Wang - University of California San Diego (United States)
Wenxin Zhou - University of California San Diego (United States)
Abstract: High-dimensional data can often display heterogeneity due to heteroscedastic variance or inhomogeneous covariate effects. Penalized quantile and expectile regression methods offer useful tools to detect heteroscedasticity in high-dimensional data. The former is computationally challenging due to the non-smooth nature of the check loss, and the latter is sensitive to heavy-tailed error distributions. We propose and study penalized robustified expectile regression (retire) in high dimensions, with a focus on concave regularization, which reduces the estimation bias from $l_1$-penalization and leads to oracle properties. Theoretically, we establish the statistical properties of the solution path of iteratively reweighted $l_1$-penalized retire estimation, adapted from the local linear approximation algorithm for folded concave regularization. Under a mild minimum signal strength condition, we show that after as many as $\log\log(d)$ iterations, the final estimator enjoys the oracle convergence rate. At each iteration, the weighted $l_1$-penalized convex program can be efficiently solved by a semismooth Newton coordinate descent algorithm. Numerical studies demonstrate the competitive performance of the proposed procedure compared with both non-robust and quantile regression-based alternatives.
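To make the iteratively reweighted $l_1$ (local linear approximation) scheme concrete, the following is a minimal NumPy sketch under stated assumptions: the retire loss is taken as a Huberized asymmetric squared loss with robustification parameter gamma, the concave penalty is SCAD, and the weighted $l_1$ subproblem is solved here by a simple proximal gradient loop rather than the semismooth Newton coordinate descent algorithm used in the paper. All function names (retire_loss_grad, scad_weight, irw_l1_retire), parameter choices, and the synthetic example are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def retire_loss_grad(u, tau=0.5, gamma=1.0):
    """Gradient (in the residual u) of an assumed Huberized asymmetric
    squared loss: asymmetric expectile weight times Huber truncation."""
    w = np.where(u < 0, 1.0 - tau, tau)           # expectile weight |tau - 1{u<0}|
    return w * np.clip(u, -gamma, gamma)          # Huber truncation at gamma

def scad_weight(beta, lam, a=3.7):
    """SCAD penalty derivative: the per-coordinate weight used in each
    reweighted-l1 step of the local linear approximation (LLA) scheme."""
    t = np.abs(beta)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_l1_retire(X, y, weights, tau, gamma, step, n_iter=500):
    """Weighted-l1-penalized retire estimate via proximal gradient descent
    (a stand-in for the semismooth Newton coordinate descent solver)."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        grad = -X.T @ retire_loss_grad(y - X @ beta, tau, gamma) / n
        beta = soft_threshold(beta - step * grad, step * weights)
    return beta

def irw_l1_retire(X, y, tau=0.5, gamma=1.0, lam=0.1, step=0.1, n_outer=3):
    """Iteratively reweighted l1 loop; a few outer steps (of order
    log(log(d))) suffice according to the theory in the abstract."""
    d = X.shape[1]
    weights = np.full(d, lam)                     # first step: plain l1 penalty
    beta = np.zeros(d)
    for _ in range(n_outer):
        beta = weighted_l1_retire(X, y, weights, tau, gamma, step)
        weights = scad_weight(beta, lam)          # refresh concave-penalty weights
    return beta

# Illustrative example on synthetic heteroscedastic, heavy-tailed data.
rng = np.random.default_rng(0)
n, d = 200, 500
X = rng.standard_normal((n, d))
beta_true = np.zeros(d); beta_true[:5] = 2.0
noise = (1 + 0.5 * np.abs(X[:, 0])) * rng.standard_t(df=3, size=n)
y = X @ beta_true + noise
beta_hat = irw_l1_retire(X, y, tau=0.7, gamma=2.0, lam=0.15)
print("largest fitted coefficients:", np.sort(np.abs(beta_hat))[-5:])
```

The outer loop mirrors the abstract's description: the first iteration is an $l_1$-penalized retire fit, and each subsequent iteration re-solves a weighted $l_1$ convex program whose weights come from the derivative of the folded concave penalty evaluated at the previous estimate.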