EcoSta 2022
A0672
Title: Learning theory of stochastic gradient descent
Authors: Yunwen Lei - The University of Hong Kong (Hong Kong) [presenting]
Abstract: Stochastic Gradient Descent (SGD) has become the workhorse behind many machine learning problems. Despite its success in applications, its theoretical analysis remains unsatisfactory. We discuss the learning theory of SGD, introducing new algorithmic stability concepts that relax existing restrictive assumptions and improve existing learning rates. Our results reveal new connections between generalization and optimization, illustrating how better learning performance can be achieved through early stopping.
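As an illustration of the setting the abstract describes, the sketch below runs plain SGD on a least-squares problem and stops early when a held-out validation loss stops improving. This is a generic, hypothetical example (not the authors' method or analysis); all data, hyperparameters, and the stopping rule are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (hypothetical, not the authors' method): SGD on least squares
# with validation-based early stopping, illustrating how stopping before full
# optimization can trade training accuracy for generalization.

rng = np.random.default_rng(0)

# Synthetic linear-regression data with additive noise (all values illustrative).
n, d = 200, 10
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)

# Train/validation split: early stopping monitors the validation loss.
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

def sgd_early_stopping(X_tr, y_tr, X_va, y_va, lr=0.01, epochs=100, patience=5):
    """Run SGD on the squared loss; stop when validation loss stops improving."""
    w = np.zeros(X_tr.shape[1])
    best_w, best_val, stall = w.copy(), np.inf, 0
    for epoch in range(epochs):
        for i in rng.permutation(len(y_tr)):          # one pass in random order
            grad = (X_tr[i] @ w - y_tr[i]) * X_tr[i]  # per-example gradient
            w -= lr * grad
        val = np.mean((X_va @ w - y_va) ** 2)
        if val < best_val:
            best_w, best_val, stall = w.copy(), val, 0
        else:
            stall += 1
            if stall >= patience:                     # early stopping triggers here
                break
    return best_w, best_val

w_hat, val_loss = sgd_early_stopping(X_tr, y_tr, X_va, y_va)
print(val_loss)
```

The validation loss here plays the role of a generalization proxy: rather than driving the training loss to its minimum, the iterate returned is the one that performed best on unseen data.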