CFE-CMStatistics 2024
A1719
Title: Computationally efficient sparse sufficient dimension reduction via least squares SVM and its extensions
Authors: Jungmin Shin - Korea University (Korea, South) [presenting]
Seung Jun Shin - Korea University (Korea, South)
Abstract: Sparse sufficient dimension reduction (SDR) is explored through the framework of the penalized principal machine (PM). We propose the penalized principal least squares support vector machine (P2LSM) as a primary example of this approach. The P2LSM employs a squared loss function, which enables efficient computation via the group coordinate descent algorithm. The method improves computational efficiency over conventional sparse SDR methods, particularly for large-scale datasets. We also extend the approach to the penalized principal asymmetric least squares regression (P2AR), the penalized principal L2-SVM (P2L2M), and the penalized principal (weighted) logistic regression (P2(W)LR), demonstrating its versatility. Additionally, the computational advantage and oracle property of the proposed methods are investigated. Extensive simulations and real data analyses illustrate the efficacy of the proposed methods in yielding sparse, interpretable solutions without compromising predictive accuracy.
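
As a rough illustration of the computational recipe outlined in the abstract, the Python sketch below estimates a sparse SDR basis by dichotomizing the response at several cutpoints, fitting a least-squares SVM-type problem at each cutpoint, and coupling each predictor's coefficients across cutpoints with a group-lasso penalty solved by group coordinate descent. All names (p2lsm, n_cutpoints, lam, d), the choice of quantile cutpoints, and the penalty level are illustrative assumptions and not the authors' implementation.

import numpy as np

def p2lsm(X, y, n_cutpoints=10, lam=0.1, n_iter=200, tol=1e-6, d=1):
    """Sparse SDR via a penalized principal least-squares machine (sketch).

    For each cutpoint c, the response is dichotomized to +/-1; with +/-1
    labels the least-squares SVM loss (1 - y*f(x))^2 reduces to squared
    error, so the smooth part of the objective is quadratic.  A group-lasso
    penalty ties each predictor's coefficients together across cutpoints,
    and group coordinate descent gives closed-form soft-thresholding updates.
    """
    n, p = X.shape
    X = (X - X.mean(axis=0)) / X.std(axis=0)             # standardize predictors

    cuts = np.quantile(y, np.linspace(0.1, 0.9, n_cutpoints))
    Y = np.where(y[:, None] > cuts[None, :], 1.0, -1.0)  # n x H label matrix
    Y = Y - Y.mean(axis=0)                                # absorb the intercepts

    B = np.zeros((p, n_cutpoints))                        # p x H coefficient matrix
    R = Y - X @ B                                         # residual matrix

    for _ in range(n_iter):
        max_change = 0.0
        for j in range(p):
            xj = X[:, j]
            Rj = R + np.outer(xj, B[j])                   # partial residual without predictor j
            z = xj @ Rj / n
            s = xj @ xj / n
            znorm = np.linalg.norm(z)
            # Group soft-thresholding: the whole j-th row is shrunk at once,
            # so a predictor is dropped from every cutpoint simultaneously.
            new_row = np.zeros(n_cutpoints) if znorm <= lam else (1.0 - lam / znorm) * z / s
            max_change = max(max_change, np.abs(new_row - B[j]).max())
            B[j] = new_row
            R = Rj - np.outer(xj, B[j])
        if max_change < tol:
            break

    # Leading left singular vectors of B span the estimated SDR directions.
    U = np.linalg.svd(B, full_matrices=False)[0]
    return U[:, :d], B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 500, 10
    X = rng.normal(size=(n, p))
    y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.2 * rng.normal(size=n)  # only x1, x2 are active
    basis, B = p2lsm(X, y, lam=0.05, d=1)
    print("selected predictors:", np.flatnonzero(np.linalg.norm(B, axis=1) > 1e-8))

Because the loss is quadratic, each group update is available in closed form, which is the source of the computational advantage over hinge-loss-based sparse SDR solvers that require quadratic programming.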