EcoSta 2022
Submission A0330
Title: Convergence of stochastic gradient descent algorithms for functional data learning

Authors:
Xiaming Chen - Shantou University (China)
Bohao Tang - Johns Hopkins Bloomberg School of Public Health (United States)
Jun Fan - Hong Kong Baptist University (Hong Kong)
Zheng-Chu Guo - Zhejiang University (China)
Lei Shi - Fudan University (China)
Xin Guo - The University of Queensland (Australia) [presenting]
Abstract: Functional linear models are a widely applied general framework for regression problems, including those with intrinsically infinite-dimensional data. Online gradient descent methods, despite their demonstrated power in processing streaming or large-scale data, are not well studied for learning with functional data. We study reproducing kernel-based online learning algorithms for functional data, in both the vanishing step-size and the finite-horizon settings. We derive convergence rates for both the prediction and the estimation problems. In particular, our analysis suggests that convergence for the prediction problem requires much weaker regularity assumptions than convergence for the estimation problem.
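
As a rough illustration of the kind of algorithm studied (a minimal sketch, not the authors' method): reproducing kernel-based online gradient descent for the functional linear model y = <beta, x> + noise, with the inner product approximated on a uniform discretization grid and polynomially decaying step sizes eta_t = eta0 * t^(-theta) for the vanishing step-size setting. The Gaussian kernel, step-size constants, and synthetic data below are illustrative assumptions, not taken from the abstract.

import numpy as np

def online_kernel_gd(stream, grid, kernel, eta0, theta=0.5):
    # Sketch: online gradient descent in an RKHS for y = <beta, x> + noise,
    # with functions represented by their values on a uniform grid.
    dt = grid[1] - grid[0]                      # grid spacing (uniform grid assumed)
    K = kernel(grid[:, None], grid[None, :])    # Gram matrix of the kernel on the grid
    beta = np.zeros_like(grid)                  # current slope-function estimate beta_t
    for t, (x, y) in enumerate(stream, start=1):
        pred = np.dot(x, beta) * dt             # <x_t, beta_t> via a Riemann sum
        err = pred - y                          # residual of the current prediction
        grad = err * (K @ x) * dt               # stochastic gradient: err * (L_K x_t) on the grid
        beta -= (eta0 * t ** (-theta)) * grad   # vanishing step-size update eta_t = eta0 * t^(-theta)
    return beta

# Example usage on synthetic data (all choices illustrative):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 101)
    kernel = lambda s, u: np.exp(-(s - u) ** 2 / 0.1)   # hypothetical Gaussian kernel
    beta_star = np.sin(2 * np.pi * grid)                # true slope function for the simulation
    def stream(n):
        for _ in range(n):
            x = rng.standard_normal(grid.size)          # toy functional covariate on the grid
            y = np.dot(x, beta_star) * (grid[1] - grid[0]) + 0.1 * rng.standard_normal()
            yield x, y
    beta_hat = online_kernel_gd(stream(5000), grid, kernel, eta0=1.0)

The finite-horizon setting mentioned in the abstract would instead fix the sample size n in advance and use a constant step size chosen as a function of n; the update rule itself is unchanged.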