CMStatistics 2021
Submission: B1020
Title: Concentration inequalities, optimal number of layers and classification fallacy of a stochastic neural network
Authors: Michele Caprio - University of Pennsylvania (United States) [presenting]
Sayan Mukherjee - Duke University (United States)
Abstract: Concentration inequalities are given for the output of the hidden layers of a stochastic feedforward neural network with ReLU activation, as well as for the output of the whole neural network. In addition, if the neural network is a martingale, we derive a martingale inequality for the output of the hidden layers and of the whole neural network, and we identify the optimal number of layers via an optimal stopping procedure. Finally, in the context of a two-category classification stochastic neural network, we approximate the behavior of the classifier and give a probabilistic bound on the loss of accuracy resulting from this approximation.
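As a rough illustration of the setting described in the abstract, the following Python/NumPy sketch builds a stochastic feedforward ReLU network whose weights are redrawn at random on every forward pass, and uses Monte Carlo repetition to observe how the network output concentrates around its mean. The Gaussian weight model, the layer widths, and the function name stochastic_relu_network are illustrative assumptions; the sketch does not reproduce the paper's actual stochastic model, inequalities, or constants.

import numpy as np

rng = np.random.default_rng(0)

def stochastic_relu_network(x, layer_widths, weight_scale=1.0):
    # Forward pass through a stochastic feedforward ReLU network:
    # weights and biases are redrawn from a Gaussian at every call,
    # so repeated evaluations at the same input give random outputs.
    h = x
    for width in layer_widths:
        W = rng.normal(scale=weight_scale / np.sqrt(h.size), size=(width, h.size))
        b = rng.normal(scale=weight_scale, size=width)
        h = np.maximum(W @ h + b, 0.0)  # ReLU activation
    return h

# Monte Carlo check of concentration: repeat the forward pass at a fixed input
# and record how often the scalar output deviates from its empirical mean.
x = rng.normal(size=10)
outputs = np.array([stochastic_relu_network(x, [32, 32, 1])[0] for _ in range(5000)])
mean = outputs.mean()
for t in (0.5, 1.0, 2.0):
    freq = np.mean(np.abs(outputs - mean) > t)
    print(f"Empirical P(|output - mean| > {t}) = {freq:.4f}")

The empirical tail frequencies shrink as the deviation threshold t grows; this qualitative decay is what concentration inequalities of the kind announced in the abstract quantify with explicit bounds.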