EcoSta 2024, Submission A0872
Title: Functional nonconvex penalization kernel smoothing for high-dimensional additive regression
Authors: Seyoung Park - Yonsei University (Korea, South) [presenting]
Abstract: Smooth backfitting has proven to be a useful estimation technique for additive regression models in various scenarios. However, existing studies are limited: they are restricted to a finite number of covariates or rely solely on an L1-penalized method in high-dimensional settings. The iterative smooth backfitting algorithm, although simple and well studied, is very time-consuming, especially in high dimensions. It has also been observed that the L1 penalty can introduce significant estimation bias, whereas concave regularization can improve estimation. New kernel estimators and an efficient nonconvex smooth backfitting algorithm for ultra-high-dimensional additive models are presented. It is shown that the proposed nonconvex optimization problem has the oracle estimator as its unique stationary point. For the implementation of the proposed method, a composite gradient algorithm is designed, and this iterative algorithm is proven to achieve a near-global optimum. The proposed algorithm allows the component functions to be updated in parallel within each iteration, substantially reducing computation time compared to the existing iterative smooth backfitting algorithm.
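The optimization template the abstract describes, a composite gradient step whose proximal update decouples across components and so can be computed in parallel, can be sketched in a simplified setting. The sketch below is not the authors' kernel-based estimator: it substitutes a plain linear model for the additive kernel components and uses the MCP penalty as a stand-in concave regularizer; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def mcp_prox(z, lam, gamma, t):
    """Proximal operator of the MCP penalty with step size t (firm thresholding).
    Requires gamma > t so the middle branch is well defined."""
    a = np.abs(z)
    return np.where(
        a <= t * lam, 0.0,
        np.where(a <= gamma * lam,
                 np.sign(z) * (a - t * lam) / (1.0 - t / gamma),
                 z))

def composite_gradient(X, y, lam=0.5, gamma=3.0, n_iter=200):
    """Composite gradient descent for 0.5*||y - X b||^2 + sum_j MCP(b_j).
    The prox is coordinate-separable, so every component of b is updated
    independently in each iteration (the parallelizable step)."""
    t = 1.0 / np.linalg.norm(X, 2) ** 2   # step = 1 / Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the smooth least-squares part
        b = mcp_prox(b - t * grad, lam, gamma, t)
    return b

# Hypothetical noise-free example: two strong signals, eight null coordinates.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
b_true = np.zeros(10)
b_true[0], b_true[1] = 2.0, -2.0
b_hat = composite_gradient(X, y=X @ b_true)
```

Because MCP applies no shrinkage to coefficients beyond `gamma * lam`, large signals are estimated without the bias an L1 penalty would introduce, which is the motivation for concave regularization mentioned above.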