EcoSta 2024
A0799
Title: Generalization analysis of deep ReLU networks for functional learning
Authors: Linhao Song - Central South University (China) [presenting]
Abstract: Learning nonlinear functionals defined on $L^p([-1,1]^s)$ for $1\le p\le \infty$ and $s\in\mathbb{N}$ is a significant learning task in broad applications. As a powerful tool of the nonparametric approach, neural networks that take functions as input were recently designed and employed to learn nonlinear functionals, achieving great success in practice. However, the underlying theoretical analysis of this approach lags far behind; for instance, universal consistency and learning rates remain open problems. The least-squares regression problem is considered using functional neural networks with the rectified linear unit (ReLU) activation function. Within the framework of learning theory, it is shown that the learning algorithm is universally consistent. In addition, generalization error bounds are established, showing a trade-off between the approximation ability and the capacity of the approximants measured by covering numbers. Based on this, the learning rates are further investigated under diverse assumptions on the target functional and the input function space.
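The setting described above can be illustrated with a minimal sketch: a functional ReLU network takes a function $f$ on $[-1,1]$ (here $s=1$), discretized on a uniform grid, and is trained by least-squares regression against a target functional. Everything below is hypothetical illustration, not the paper's construction: the quadratic energy functional $F(f)=\int_{-1}^{1} f(x)^2\,dx$, the sinusoidal input-function family, the one-hidden-layer architecture, and the plain gradient-descent training loop are all assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize input functions on a uniform grid over [-1, 1] (s = 1 here).
m = 64
grid = np.linspace(-1.0, 1.0, m)
dx = grid[1] - grid[0]

# Hypothetical target functional: F(f) = integral of f(x)^2 over [-1, 1],
# approximated by a Riemann sum on the grid.
def target_functional(fvals):
    return np.sum(fvals**2) * dx

# Random smooth input functions f(x) = a*sin(pi x) + b*cos(pi x) + c,
# represented by their values on the grid.
n = 512
coef = rng.normal(size=(n, 3))
X = (coef[:, :1] * np.sin(np.pi * grid)
     + coef[:, 1:2] * np.cos(np.pi * grid)
     + coef[:, 2:3])
y = np.array([target_functional(f) for f in X])

# One-hidden-layer functional ReLU network: discretized f -> ReLU layer -> scalar.
h = 32
W1 = rng.normal(scale=1.0 / np.sqrt(m), size=(m, h))
b1 = np.zeros(h)
W2 = rng.normal(scale=1.0 / np.sqrt(h), size=(h, 1))
b2 = np.zeros(1)

def forward(X):
    Z = X @ W1 + b1
    A = np.maximum(Z, 0.0)            # ReLU activation
    return (A @ W2 + b2).ravel(), Z, A

def empirical_risk(X, y):
    return np.mean((forward(X)[0] - y) ** 2)  # least-squares empirical risk

mse0 = empirical_risk(X, y)

# Full-batch gradient descent on the least-squares risk (manual backprop).
lr = 1e-3
for step in range(5000):
    pred, Z, A = forward(X)
    G = 2.0 * (pred - y)[:, None] / n  # d(risk)/d(pred)
    W2 -= lr * (A.T @ G)
    b2 -= lr * G.sum(axis=0)
    GA = (G @ W2.T) * (Z > 0)          # backprop through the ReLU layer
    W1 -= lr * (X.T @ GA)
    b1 -= lr * GA.sum(axis=0)

mse = empirical_risk(X, y)
print(f"empirical risk: {mse0:.3f} -> {mse:.3f}")
```

Discretizing the input function on a grid is only one way to feed a function into a network; the trade-off the abstract refers to, between approximation ability and the covering-number capacity of the hypothesis class, would here be governed by the grid size $m$, the width $h$, and the depth of the network.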