EcoSta 2024 Submission A0469
Title: Sparse sufficient dimension reduction via penalized gradient learning for composite quantile regressions
Authors: Seogyoung Lee - Korea University (Korea, South) [presenting]
Seung Jun Shin - Korea University (Korea, South)
Abstract: Identifying informative features is essential in machine learning. Among many approaches, sufficient dimension reduction (SDR) has gained great attention due to its promising performance in extracting a small number of features in both regression and classification problems. SDR, however, often suffers from poor interpretability, especially in high dimensions, since all predictors have non-zero loadings. A novel sparse SDR method, called sparse-qOPG, is proposed by extending the idea of qOPG, which elegantly employs a series of quantile regressions at different quantile levels to conduct SDR, i.e., to estimate the central subspace. The qOPG objective function is first re-expressed in terms of the gradients of the quantile functions in a reproducing kernel Hilbert space. A non-convex group penalty is then applied to the gradient term to obtain a sparse basis estimator for the central subspace. The selection consistency of sparse-qOPG is established, and its promising performance is demonstrated in both synthetic and real data applications.
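The following is a minimal, illustrative Python sketch of the unpenalized qOPG backbone the abstract describes: local-linear quantile regressions at several quantile levels yield pointwise gradient estimates of the conditional quantile functions, and the eigenvectors of their averaged outer product estimate a basis of the central subspace. The Gaussian kernel, smoothed check loss, rule-of-thumb bandwidth, and all function names are assumptions made for illustration; the paper's RKHS re-expression and the non-convex group penalty that yield the sparse estimator are not reproduced here.

```python
# Illustrative sketch only: unpenalized qOPG-style estimation of the central
# subspace. The kernel, bandwidth rule, and smoothing of the check loss are
# assumptions; the authors' RKHS formulation and non-convex group penalty
# (which produce sparsity) are omitted.
import numpy as np
from scipy.optimize import minimize


def smoothed_check_loss(r, tau, eps=0.05):
    # rho_tau(r) = max(tau*r, (tau-1)*r), smoothed via a softplus for the optimizer
    return tau * r + eps * np.logaddexp(0.0, -r / eps)


def local_quantile_gradient(X, y, x0, tau, h):
    # Kernel-weighted local-linear quantile fit around x0; the slope estimates
    # the gradient of the tau-th conditional quantile function at x0.
    n, p = X.shape
    Z = X - x0
    w = np.exp(-np.sum(Z ** 2, axis=1) / (2.0 * h ** 2))  # Gaussian kernel weights

    def objective(theta):  # theta = (local intercept, local gradient)
        r = y - theta[0] - Z @ theta[1:]
        return np.sum(w * smoothed_check_loss(r, tau))

    fit = minimize(objective, np.zeros(p + 1), method="L-BFGS-B")
    return fit.x[1:]


def qopg_basis(X, y, d, taus=(0.25, 0.5, 0.75), h=None):
    # Average outer products of local quantile gradients over sample points and
    # quantile levels, then keep the top-d eigenvectors as the basis estimate.
    n, p = X.shape
    h = h if h is not None else 1.5 * n ** (-1.0 / (p + 4))  # rule-of-thumb bandwidth
    M = np.zeros((p, p))
    for tau in taus:
        for i in range(n):
            g = local_quantile_gradient(X, y, X[i], tau, h)
            M += np.outer(g, g)
    M /= n * len(taus)
    eigvec = np.linalg.eigh(M)[1]        # eigenvalues in ascending order
    return eigvec[:, ::-1][:, :d]        # leading d directions


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 200, 6
    X = rng.standard_normal((n, p))
    beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
    y = np.sin(X @ beta) + 0.2 * rng.standard_normal(n)
    B_hat = qopg_basis(X, y, d=1)
    print("estimated direction (up to sign):", np.round(B_hat.ravel(), 2))
```

In this toy single-index model the leading eigenvector should roughly align with beta up to sign; the sparse-qOPG of the abstract would additionally shrink the loadings of the four irrelevant predictors exactly to zero through the group penalty on the gradient term.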