CMStatistics 2019
B2005
Title: Neural tangent kernel: Dynamics of infinitely wide DNNs
Authors: Arthur Jacot - Ecole Polytechnique Federale (Switzerland) [presenting]
Franck Gabriel - Ecole Polytechnique Federale (Switzerland)
Clement Hongler - Ecole Polytechnique Federale (Switzerland)
Abstract: Modern deep learning has popularized the use of very large neural networks, but the theoretical tools to study such networks are still lacking. The Neural Tangent Kernel (NTK) describes how the output of a network evolves during training. In the infinite-width limit (when the number of hidden neurons grows to infinity), the NTK converges to a deterministic limit that remains fixed during training, leading to a simple description of the dynamics of infinitely wide DNNs and revealing a link to kernel methods. The NTK depends on the architecture of the network, making it a useful tool to understand how architectural choices affect the convergence and generalization of DNNs.
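For concreteness, the empirical NTK of a finite network is the inner product of parameter Jacobians, Theta(x, x') = <df/dtheta(x), df/dtheta(x')>, and can be computed directly with automatic differentiation. The following is a minimal sketch in JAX, not part of the submission: the fully connected architecture, tanh nonlinearity, widths, and helper names are illustrative assumptions; only the 1/sqrt(width) "NTK parameterization" scaling reflects the setup the abstract refers to.

```python
import jax
import jax.numpy as jnp

def init_params(key, widths=(1, 512, 512, 1)):
    # Illustrative network: i.i.d. N(0, 1) weights; the 1/sqrt(fan_in)
    # NTK scaling is applied in forward() rather than at initialization.
    params = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append(jax.random.normal(sub, (n_in, n_out)))
    return params

def forward(params, x):
    h = x
    for i, w in enumerate(params):
        h = h @ w / jnp.sqrt(w.shape[0])   # NTK parameterization
        if i < len(params) - 1:
            h = jnp.tanh(h)
    return h.squeeze()                     # scalar network output

def empirical_ntk(params, x1, x2):
    # Theta(x1, x2) = <df/dtheta(x1), df/dtheta(x2)>,
    # summed over all parameters of the network.
    j1 = jax.jacobian(forward)(params, x1)  # pytree matching params
    j2 = jax.jacobian(forward)(params, x2)
    v1 = jnp.concatenate([g.ravel() for g in jax.tree_util.tree_leaves(j1)])
    v2 = jnp.concatenate([g.ravel() for g in jax.tree_util.tree_leaves(j2)])
    return v1 @ v2

key = jax.random.PRNGKey(0)
params = init_params(key)
x1, x2 = jnp.array([0.5]), jnp.array([-0.3])
print(empirical_ntk(params, x1, x2))
```

Rerunning this with larger hidden widths (and fresh random initializations) would illustrate the convergence the abstract describes: the kernel value fluctuates less and less across initializations as the width grows.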