EcoSta 2022
A0205
Title: Simple stochastic and online gradient descent algorithms for pairwise learning
Authors: Yiming Ying - University of Sydney (Australia) [presenting]
Abstract: Pairwise learning refers to learning tasks where the loss function depends on a pair of instances. It instantiates many important machine learning tasks, such as bipartite ranking and metric learning. A popular approach to handling streaming data in pairwise learning is the online gradient descent (OGD) algorithm, in which one pairs the current instance with a buffering set of previous instances of sufficiently large size, and which therefore suffers from scalability issues. We will present our recent proposal of simple stochastic and online gradient descent methods for pairwise learning. A notable difference from existing studies is that our proposed method pairs the current instance only with the previous one when building a gradient direction, which is efficient in both storage and computation. We will present novel stability results, as well as optimization and generalization error bounds, for both convex and nonconvex as well as both smooth and nonsmooth problems. The study resolves an open question on developing meaningful generalization bounds for OGD using a buffering set with a very small fixed size. We also extend our algorithms and stability analysis to develop differentially private SGD algorithms for pairwise learning that significantly improve the existing results.
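The core algorithmic idea in the abstract, pairing each incoming instance only with the immediately preceding one rather than with a large buffer, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `pairwise_sgd`, the squared pairwise loss, and the constant step size are all assumptions chosen for concreteness.

```python
import numpy as np

def pairwise_sgd(stream, dim, eta=0.01):
    """Online gradient descent for pairwise learning that pairs the
    current instance only with the previous one (buffer of size 1).

    Illustrative sketch with an assumed squared pairwise loss
    ell(w; z, z') = (w.(x - x') - (y - y'))^2; the abstract's method
    covers general (also nonconvex/nonsmooth) pairwise losses."""
    w = np.zeros(dim)
    prev = None  # only the single previous instance is stored
    for x, y in stream:
        if prev is not None:
            xp, yp = prev
            diff = x - xp
            # gradient of the squared pairwise loss at (z_t, z_{t-1})
            grad = 2.0 * (w @ diff - (y - yp)) * diff
            w -= eta * grad
        prev = (x, y)  # O(d) storage, O(d) work per step
    return w
```

Note that per step the algorithm touches one stored instance, so both memory and computation stay constant in the stream length, in contrast to buffer-based OGD, whose per-step cost grows with the buffer size.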