A0445
Title: Sparse convoluted rank regression in high dimensions
Authors: Le Zhou - Hong Kong Baptist University (Hong Kong) [presenting]
Boxiang Wang - University of Iowa (United States)
Hui Zou - University of Minnesota (United States)
Abstract: High-dimensional sparse penalized rank regression was studied in 2020 and shown to enjoy nice theoretical properties. Compared with least squares, rank regression can substantially gain estimation efficiency while maintaining a minimal relative efficiency of 86.4\%. However, computing penalized rank regression can be very challenging for high-dimensional data due to the highly nonsmooth rank regression loss. Convoluted rank regression is proposed, and the sparse penalized convoluted rank regression (CRR) for high-dimensional data is studied. Some interesting asymptotic properties of CRR are proven. Under the same key assumptions as for sparse rank regression, the rate of convergence of the $\ell_1$-penalized CRR with a tuning-free penalization parameter is established, and the strong oracle property of the folded concave penalized CRR is proven. Further, a high-dimensional Bayesian information criterion is proposed for selecting the penalization parameter in folded concave penalized CRR, and its selection consistency is proven. An efficient algorithm is derived for solving sparse convoluted rank regression that scales well with high dimensions. Numerical examples demonstrate the promising performance of sparse convoluted rank regression over sparse rank regression. The theoretical and numerical results suggest that sparse convoluted rank regression enjoys the best of both sparse least squares regression and sparse rank regression.
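To fix ideas, here is a minimal sketch in notation chosen for illustration (not necessarily the paper's exact formulation): classical rank regression minimizes a dispersion of pairwise residual differences, which is nonsmooth, and convolution smoothing replaces the absolute value with its convolution against a smooth kernel before adding a sparsity penalty such as the $\ell_1$ norm.
\[
L_n(\beta) \;=\; \frac{1}{n(n-1)} \sum_{i \neq j} \bigl| e_i(\beta) - e_j(\beta) \bigr|,
\qquad e_i(\beta) \;=\; y_i - x_i^{\top}\beta,
\]
\[
L_{n,h}(\beta) \;=\; \frac{1}{n(n-1)} \sum_{i \neq j} \int \bigl| e_i(\beta) - e_j(\beta) - h u \bigr|\, K(u)\, \mathrm{d}u,
\qquad
\widehat{\beta} \;\in\; \arg\min_{\beta} \Bigl\{ L_{n,h}(\beta) + \lambda \|\beta\|_1 \Bigr\},
\]
where $K$ is a smooth kernel density and $h>0$ a bandwidth, so that $L_{n,h}$ is differentiable and amenable to scalable optimization; a folded concave penalized variant would replace $\lambda\|\beta\|_1$ with a penalty such as SCAD or MCP.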