View Submission - EcoSta2019
A0365
Title: Theory of deep convolutional neural networks for deep learning
Authors: Dingxuan Zhou - University of Sydney (Australia) [presenting]
Abstract: Deep learning has been widely applied and has brought breakthroughs in speech recognition, computer vision, and many other domains. The deep neural network architectures involved and the associated computational issues have been well studied in machine learning. However, there is still no theoretical foundation for understanding the approximation or generalization ability of deep learning methods based on network architectures with convolutional structures, such as deep convolutional neural networks (CNNs). The convolutional architecture makes deep CNNs essentially different from fully connected deep neural networks, so the classical approximation theory of fully connected networks developed around 30 years ago does not apply. An approximation theory of deep CNNs associated with the rectified linear unit (ReLU) is described. In particular, we show the universality of such deep CNNs, meaning that they can approximate any continuous function to arbitrary accuracy when the depth of the network is large enough. Our quantitative estimate, stated tightly in terms of the number of free parameters to be computed, verifies the efficiency of deep CNNs in dealing with high-dimensional data.
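As a minimal sketch of the universality statement referred to in the abstract (the notation $\Omega$, $d$, $J$, $f^{(J)}$ is introduced here for concreteness and is not taken from the submission): for a compact input domain $\Omega \subset \mathbb{R}^d$, universality asserts that for every continuous function $f \in C(\Omega)$ and every accuracy $\varepsilon > 0$ there exist a depth $J$ and a ReLU deep CNN producing an output function $f^{(J)}$ with
\[
\big\| f - f^{(J)} \big\|_{C(\Omega)} \;=\; \sup_{x \in \Omega} \big| f(x) - f^{(J)}(x) \big| \;\le\; \varepsilon .
\]
The quantitative estimate mentioned in the abstract then bounds this approximation error in terms of the number of free parameters of $f^{(J)}$; the precise rate is given in the talk and is not reproduced here.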