EcoSta 2024, Submission A0233
Title: The non-overlapping approximation to overlapping group lasso
Authors: Tianxi Li - University of Minnesota (United States) [presenting]
Abstract: The group lasso penalty is widely used to introduce structured sparsity in statistical learning, characterized by its ability to eliminate predefined groups of parameters automatically. However, when the groups overlap, solving the group lasso problem can be time-consuming in high-dimensional settings because of the non-separability induced by the overlapping groups. This difficulty has significantly limited the penalty's applicability in cutting-edge computational areas, such as gene pathway selection and graphical model estimation. A non-overlapping and separable penalty is introduced to efficiently approximate the overlapping group lasso penalty. The approximation substantially improves the computational efficiency of the optimization, especially for large-scale and high-dimensional problems. It is shown that the proposed penalty is the tightest separable relaxation of the overlapping group lasso norm within a broad family of norms. Furthermore, the estimators based on the proposed norm are statistically equivalent to those derived from the overlapping group lasso in terms of estimation error, support recovery, and minimax rate under the squared loss. The method's effectiveness is demonstrated through extensive simulation examples and a predictive task on cancer tumor data.
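For orientation, a minimal sketch of the penalties discussed above, in standard notation; the exact form of the proposed relaxation is not given in the abstract, so the second display is a generic separable norm shown only to illustrate what separability means, not the paper's construction:

% Overlapping group lasso norm: the groups g in G may share coordinates,
% so the penalty does not split across individual entries of beta.
\[
  \Omega_{\mathrm{overlap}}(\beta) \;=\; \sum_{g \in \mathcal{G}} w_g \,\lVert \beta_g \rVert_2,
  \qquad \beta_g = (\beta_j)_{j \in g}.
\]
% A separable penalty, by contrast, decomposes coordinate-wise, which is what
% makes proximal or coordinate-descent updates cheap; the per-coordinate
% functions h_j here are illustrative placeholders, not the paper's weights.
\[
  \Omega_{\mathrm{sep}}(\beta) \;=\; \sum_{j=1}^{p} h_j\bigl(\lvert \beta_j \rvert\bigr).
\]

A penalized estimator then takes the usual form \(\hat\beta = \arg\min_\beta \tfrac{1}{2n}\lVert y - X\beta \rVert_2^2 + \lambda\,\Omega(\beta)\), and the computational gain described in the abstract comes from replacing \(\Omega_{\mathrm{overlap}}\) by a separable surrogate in this optimization.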