B1071
Title: Local elasticity: A phenomenological approach toward understanding deep learning
Authors: Weijie Su - The Wharton School, University of Pennsylvania (United States) [presenting]
Abstract: Motivated by the iterative nature of training neural networks, the following question arises: If the weights of a neural network are updated using the induced gradient on an image of a tiger, how does this update impact the prediction of the neural network at another image (say, an image of another tiger, a cat, or a plane)? To address this question, we will introduce a phenomenon termed local elasticity. Roughly speaking, our experiments show that modern deep neural networks are locally elastic in the sense that, at late stages of the training process, the change in prediction is likely to be most significant at another tiger and least significant at a plane. We will illustrate some implications of local elasticity by relating it to the neural tangent kernel and by improving the generalization bound obtained from uniform stability. Moreover, we will introduce a phenomenological model for simulating neural networks, which suggests that local elasticity may arise from the sharing of low-level and intermediate-level features. Finally, we will offer a local-elasticity-focused agenda for future research toward a theoretical foundation for deep learning.
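
The measurement behind the opening question can be illustrated with a minimal sketch, not taken from the talk: assuming a toy MLP on synthetic inputs (stand-ins for the tiger, cat, and plane images), it applies a single SGD step on one example and records how much the network's prediction changes at each probe. Under local elasticity, the change would be largest at the most similar probe and smallest at the most dissimilar one.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical small classifier standing in for the deep networks discussed in the talk.
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 3),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Synthetic "images": x_update plays the role of the tiger we train on;
# the probes stand in for another tiger, a cat, and a plane.
x_update = torch.randn(1, 32)
y_update = torch.tensor([0])
probes = {
    "another tiger": x_update + 0.1 * torch.randn(1, 32),  # similar to x_update
    "cat": torch.randn(1, 32),
    "plane": torch.randn(1, 32),
}

def predictions(xs):
    # Softmax outputs at each probe, without tracking gradients.
    with torch.no_grad():
        return {name: F.softmax(model(x), dim=-1) for name, x in xs.items()}

before = predictions(probes)

# One SGD step using the gradient induced by the single training example.
optimizer.zero_grad()
loss = F.cross_entropy(model(x_update), y_update)
loss.backward()
optimizer.step()

after = predictions(probes)

# Change in prediction at each probe after the update.
for name in probes:
    change = (after[name] - before[name]).norm().item()
    print(f"prediction change at {name!r}: {change:.4f}")
```

In practice one would run this probe on a network partway through real training rather than a freshly initialized toy model; the sketch only fixes the bookkeeping of "update on one example, measure the effect elsewhere."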