View Submission - EcoSta2024
A0271
Title: Sparse online regression algorithm with insensitive loss functions
Authors: Ting Hu - School of Management, Xi'an Jiaotong University (China) [presenting]
Abstract: A class of kernel-based online gradient descent algorithms is presented for addressing regression problems, which generate sparse estimators iteratively to reduce the algorithmic complexity of training on streaming data and of model selection in large-scale learning scenarios. In the setting of support vector regression, the sparse online learning algorithm is designed by introducing a sequence of insensitive distance-based loss functions. Consistency and error bounds are proven, quantifying the generalization performance of such algorithms under mild conditions. The theoretical results demonstrate the interplay between statistical accuracy and sparsity during the learning process. It is shown that the insensitive parameter plays a crucial role in providing sparsity as well as fast convergence rates. Numerical experiments support the theoretical results.
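The abstract does not give the update rule, but the sparsity mechanism it describes can be illustrated with a minimal sketch of kernel online gradient descent under an epsilon-insensitive loss: when a new example's residual falls inside the insensitive band, the (sub)gradient is zero, so the example contributes no kernel expansion coefficient and need not be stored. All names, the kernel choice, the step-size schedule, and the regularization parameter below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np


def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel; the kernel choice is an assumption for illustration.
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))


def sparse_online_insensitive_regression(stream, eps=0.1, lam=0.01, eta0=0.5, sigma=1.0):
    """Hypothetical sketch: online gradient descent in an RKHS with an
    epsilon-insensitive loss max(0, |y - f(x)| - eps).

    Examples whose residual lies inside the insensitive band have zero
    subgradient, so no coefficient is added for them -- this is the source
    of the sparsity discussed in the abstract.
    """
    centers, coefs = [], []                      # stored inputs and expansion coefficients
    for t, (x, y) in enumerate(stream, start=1):
        eta = eta0 / np.sqrt(t)                  # illustrative decaying step size
        # Current prediction f_t(x) = sum_i coefs[i] * K(centers[i], x).
        fx = sum(c * gaussian_kernel(z, x, sigma) for z, c in zip(centers, coefs))
        # Shrink existing coefficients (effect of the regularization term).
        coefs = [(1.0 - eta * lam) * c for c in coefs]
        residual = y - fx
        if abs(residual) > eps:
            # Outside the band: add the new point with a nonzero coefficient.
            centers.append(np.asarray(x, dtype=float))
            coefs.append(eta * np.sign(residual))
        # Inside the band: subgradient is zero, nothing is stored (sparsity).
    return centers, coefs


if __name__ == "__main__":
    # Toy usage: noisy sine data presented as a stream.
    rng = np.random.default_rng(0)
    xs = rng.uniform(-3, 3, size=(500, 1))
    ys = np.sin(xs[:, 0]) + 0.05 * rng.standard_normal(500)
    centers, coefs = sparse_online_insensitive_regression(zip(xs, ys), eps=0.1)
    print(f"stored {len(centers)} of 500 examples")
```

In this sketch a larger insensitive parameter eps discards more examples (greater sparsity) at the cost of a coarser fit, which mirrors the accuracy-sparsity trade-off the abstract attributes to the insensitive parameter.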