CMStatistics 2023
B1526
Title: ReHLine: Regularized composite ReLU-ReHU loss minimization with linear computation and linear convergence
Authors: Yixuan Qiu - Shanghai University of Finance and Economics (China) [presenting]
Ben Dai - The Chinese University of Hong Kong (China)
Abstract: Empirical risk minimization (ERM) is a crucial framework that offers a general approach to handling a broad range of machine learning tasks. A novel algorithm, called ReHLine, is proposed for minimizing a set of regularized ERMs with convex piecewise linear-quadratic loss functions and optional linear constraints. The proposed algorithm can effectively handle diverse combinations of loss functions, regularizations, and constraints, making it particularly well-suited for complex domain-specific problems, such as FairSVM, elastic net regularized quantile regression, and Huber minimization. In addition, ReHLine enjoys a provable linear convergence rate and a per-iteration computational complexity that scales linearly with the sample size. The algorithm is implemented with both Python and R interfaces, and its performance is benchmarked on various tasks and datasets. The experimental results demonstrate that ReHLine surpasses generic optimization solvers by around 1000x in terms of computational efficiency on large-scale datasets. Moreover, it outperforms specialized solvers such as LIBLINEAR for SVM and hqreg for Huber minimization, exhibiting exceptional flexibility and efficiency.
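The "composite ReLU-ReHU" representation in the title refers to writing a piecewise linear-quadratic loss as a sum of ReLU (hinge-type) terms and ReHU (rectified Huber-type) terms. A minimal sketch, assuming the ReHU definition from the ReHLine paper (ReHU_tau(z) = 0 for z <= 0, z^2/2 for 0 < z <= tau, and tau*(z - tau/2) for z > tau), illustrating how the classical Huber loss decomposes into two ReHU pieces:

```python
def relu(z):
    # Rectified linear unit: the building block for hinge-type losses (e.g. SVM).
    return max(z, 0.0)

def rehu(z, tau):
    # Rectified Huber unit (definition assumed from the ReHLine paper):
    # zero on the left, quadratic in the middle, linear in the right tail.
    if z <= 0:
        return 0.0
    if z <= tau:
        return 0.5 * z * z
    return tau * (z - 0.5 * tau)

def huber(z, tau):
    # Classical Huber loss: quadratic near zero, linear in the tails.
    return 0.5 * z * z if abs(z) <= tau else tau * (abs(z) - 0.5 * tau)

# The Huber loss decomposes as ReHU_tau(z) + ReHU_tau(-z),
# so Huber minimization fits the composite ReLU-ReHU framework.
for z in [-3.0, -0.5, 0.0, 0.7, 2.5]:
    composed = rehu(z, 1.0) + rehu(-z, 1.0)
    assert abs(composed - huber(z, 1.0)) < 1e-12
```

Hinge-type losses such as the SVM loss are covered analogously by the ReLU terms alone, which is why diverse losses reduce to one solver interface.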