View Submission - EcoSta2023
A0902
Title: Contrastive learning: An expansion and shrinkage perspective
Authors: Yiqiao Zhong - UW Madison (United States) [presenting]
Cong Ma - University of Chicago (United States)
Yu Gui - University of Chicago (United States)
Abstract: Contrastive learning is an unsupervised learning framework that has recently received much attention in the deep learning community. It achieves remarkable empirical performance, especially when few or no labelled training examples are available. Compared with the usual encoder-decoder structure of autoencoders, contrastive learning introduces positive/negative pairs and replaces the decoder with a projector. Intriguing puzzles about the role of the projector and the phenomenon of dimensional collapse are often reported but not fully understood. Two major effects that contrastive learning promotes are identified, namely expansion and shrinkage. The analysis is based on the Gaussian mixture model, which, despite its simplicity, allows a systematic treatment. It reveals a rich phase transition phenomenon and characterizes generalization properties on downstream tasks, which closely match experimental results. The expansion and shrinkage perspective is a step toward demystifying these empirical puzzles and can potentially improve practice in self-supervised learning.
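To make the positive/negative-pair mechanism mentioned in the abstract concrete, below is a minimal numpy sketch of an InfoNCE-style contrastive loss with an encoder followed by a projector. The linear weights, noise augmentations, and temperature here are illustrative assumptions for exposition only, not the model analyzed in the talk.

```python
import numpy as np

def normalize(x):
    # project each row onto the unit sphere
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE-style contrastive loss on two views.

    z1[i] and z2[i] are projector outputs of two augmentations of
    sample i (the positive pair); all other rows act as negatives.
    """
    z1, z2 = normalize(z1), normalize(z2)
    logits = z1 @ z2.T / temperature  # pairwise cosine similarities
    # positives sit on the diagonal; softmax cross-entropy against them
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# toy pipeline: encoder then projector, both hypothetical linear maps
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))       # a batch of inputs
W_enc = rng.normal(size=(16, 10))  # "encoder"
W_proj = rng.normal(size=(10, 4))  # "projector" (replaces the decoder)
view1 = (x + 0.05 * rng.normal(size=x.shape)) @ W_enc @ W_proj
view2 = (x + 0.05 * rng.normal(size=x.shape)) @ W_enc @ W_proj
loss = info_nce_loss(view1, view2)
print(float(loss))
```

Pulling matched views together (the diagonal of the similarity matrix) while pushing the rest apart is what drives the expansion and shrinkage effects the abstract describes.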