A0732
Title: High-dimensional nonconvex penalized regression and post-selection least squares: A local asymptotic perspective
Authors: Xiaoya Xu - Shenzhen Polytechnic University (China) [presenting]
Abstract: In the realm of high-dimensional linear regression, nonconvex penalized estimators have enjoyed increasing popularity due to their much-acclaimed oracle property, which holds under assumptions weaker than those typically required for convex penalized estimators to enjoy the same property. However, the validity of this oracle property and of the accompanying inference tools is questionable in the presence of many weak signals and/or a few moderate signals, which may incur substantial biases. To address this issue, a more holistic assessment of the selection and convergence properties of nonconvex penalized estimators is first provided from a local asymptotic perspective, under a framework that accommodates many weak signals and heavy-tail conditions on covariates and random errors. It is then shown that post-selection least squares estimation has the beneficial effect of removing the bias incurred by nonconvex penalization of moderate signals. Post-selection least squares estimators acquire convergence properties more desirable than those of nonconvex penalized estimators and, when the nonconvex optimization program admits multiple solutions, are rate-wise more robust to the choice of selected set. Empirical results from large-scale simulation studies corroborate the theoretical findings. In particular, the post-selection least squares method improves on nonconvex penalized estimation, especially under heavy-tailed settings.
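As a minimal sketch of the two-stage procedure described above (notation is assumed, not taken from the abstract): let $y \in \mathbb{R}^n$ and $X \in \mathbb{R}^{n \times p}$ denote the response and design, let $p_\lambda$ be a nonconvex penalty such as SCAD or MCP, and let $\hat S = \{ j : \hat\beta^{\mathrm{pen}}_j \neq 0 \}$ be the support selected in the first stage. The penalized and post-selection least squares estimators can then be written as
\[
  \hat\beta^{\mathrm{pen}} \in \arg\min_{\beta \in \mathbb{R}^p}
    \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
    + \sum_{j=1}^{p} p_\lambda\bigl(\lvert \beta_j \rvert\bigr),
  \qquad
  \hat\beta^{\mathrm{pls}} \in \arg\min_{\substack{\beta \in \mathbb{R}^p \\ \beta_j = 0,\ j \notin \hat S}}
    \lVert y - X\beta \rVert_2^2 ,
\]
so that the second stage refits by ordinary least squares on the selected coordinates only, discarding the shrinkage that the penalty imposes on moderate signals.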