CMStatistics 2022
B1928
Title: On the infinite depth limit of finite width neural networks
Authors: Soufiane Hayou - National University of Singapore (Singapore) [presenting]
Abstract: The infinite-depth limit of finite-width residual neural networks is discussed. While the infinite-width limit of deep neural networks has been extensively studied, the converse infinite-depth limit remains poorly understood. We show that, with proper scaling, fixing the width and taking the depth to infinity causes the vector of pre-activations to converge in distribution to a zero-drift diffusion process that is essentially governed by the activation function. Unlike the infinite-width limit, where the neurons exhibit Gaussian behavior, the infinite-depth limit (with finite width) yields different distributions depending on the choice of the activation function. We further discuss the sequential infinite-depth-then-infinite-width limit and show some key differences with the converse infinite-width-then-infinite-depth limit.
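As an illustration of the setup described in the abstract, the following is a minimal sketch of a finite-width residual network whose residual branch is scaled by 1/sqrt(depth), so that the pre-activation trajectory remains stochastically stable as depth grows. The scaling, the update rule, and all function names here are assumptions for illustration only; the talk's exact parameterization may differ.

```python
import numpy as np

def resnet_preactivations(width, depth, activation, seed=0):
    """Simulate the pre-activation vector of a finite-width residual
    network with an assumed 1/sqrt(depth) branch scaling:
        x_{l+1} = x_l + W_l phi(x_l) / sqrt(depth).
    This is a hypothetical sketch, not the paper's exact construction."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)  # random input vector
    for _ in range(depth):
        # i.i.d. Gaussian weights with standard 1/sqrt(width) variance scaling
        W = rng.standard_normal((width, width)) / np.sqrt(width)
        x = x + W @ activation(x) / np.sqrt(depth)
    return x

relu = lambda z: np.maximum(z, 0.0)

# Fixed width, large depth: sampling many networks gives a non-degenerate
# output distribution whose shape depends on the activation function.
for act, name in [(relu, "relu"), (np.tanh, "tanh")]:
    samples = [resnet_preactivations(4, 512, act, seed=s)[0] for s in range(200)]
    print(name, float(np.mean(samples)), float(np.std(samples)))
```

Comparing the empirical histograms of `samples` across activations is one way to see the abstract's claim that, unlike the Gaussian infinite-width limit, the infinite-depth limit depends on the activation.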