B0887
Title: Benign overfitting in distributed learning
Authors: Nicole Muecke - TU Braunschweig (Germany) [presenting]
Abstract: While large training datasets generally improve model performance, they also make the training process computationally expensive and time-consuming. Distributed learning (DL) is a common strategy for reducing the overall training time by exploiting multiple computing devices. Recently, it has been observed in the single-machine setting that overparameterization is essential for benign overfitting. We analyze distributed ridgeless linear regression and show that overparameterization is essential for benign overfitting in the distributed setting as well. Moreover, we show that both the covariance structure of the data and the hardness of the learning problem, expressed in terms of a source condition, determine the efficiency of DL in the presence of overparameterization. The results show that the number of local nodes acts as an explicit regularization parameter and that the efficiency may increase linearly with the number of machines. This is in stark contrast to the underparameterized regime, where the efficiency is known to decrease linearly with the number of machines.
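To make the setting concrete, the following is a minimal sketch (not the authors' code) of divide-and-conquer ridgeless linear regression: the sample is split across local nodes, each node computes the minimum-norm least-squares (ridgeless) interpolator on its shard, and the local estimators are averaged. All names (`distributed_ridgeless`, the toy data, the choice of 4 nodes) are illustrative assumptions, not part of the submission.

```python
import numpy as np

def local_minimum_norm_fit(X, y):
    # Minimum-norm (ridgeless) least-squares solution via the pseudoinverse;
    # in the overparameterized regime (d > local sample size) this
    # interpolates the local training data.
    return np.linalg.pinv(X) @ y

def distributed_ridgeless(X, y, num_nodes):
    # Split the sample evenly across local nodes, fit a ridgeless linear
    # model on each shard, and average the local estimators. The number of
    # nodes plays the role of the regularization parameter discussed above.
    shards = np.array_split(np.arange(len(y)), num_nodes)
    local_estimates = [local_minimum_norm_fit(X[idx], y[idx]) for idx in shards]
    return np.mean(local_estimates, axis=0)

# Toy usage with an overparameterized Gaussian design (d > n per node).
rng = np.random.default_rng(0)
n, d = 200, 500
X = rng.standard_normal((n, d))
beta_star = rng.standard_normal(d) / np.sqrt(d)
y = X @ beta_star + 0.1 * rng.standard_normal(n)

beta_hat = distributed_ridgeless(X, y, num_nodes=4)
print("estimation error:", np.linalg.norm(beta_hat - beta_star))
```

In this sketch, each local fit overfits its own shard benignly, and averaging over nodes stabilizes the aggregate estimator, which is the mechanism by which the node count acts as a regularization parameter.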