CMStatistics 2023
B1062
Title: Real log canonical threshold: A complexity measure for deep neural networks
Authors: Susan Wei - University of Melbourne (Australia) [presenting]
Abstract: The number of weights in a deep neural network does not adequately reflect its complexity. Singular learning theory instead suggests that the complexity of singular models, such as neural networks, should be measured by the real log canonical threshold (RLCT). The RLCT is an invariant of the underlying model-truth-prior triplet, derived from Hironaka's resolution of singularities applied to the KL divergence. A computationally efficient estimator of the RLCT is proposed, which effectively captures the complexity of a learned neural network. It is demonstrated that, for networks trained with common stochastic optimizers, this learned complexity is only a small fraction of the total number of weights.
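As a rough illustration of how an RLCT can be estimated in practice (this is a generic two-temperature, WBIC-style sketch, not the estimator proposed in the talk), consider the simplest possible setting: a one-parameter Gaussian model N(w, 1) with data from N(0, 1). The RLCT equals the usual d/2 = 1/2 here because the model is regular, and the tempered posterior is available in closed form, so no MCMC is needed. All names and temperature choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000
x = rng.normal(0.0, 1.0, size=n)  # truth: N(0, 1)
sx, sx2 = x.sum(), (x ** 2).sum()

def nL(w):
    # n * (negative log likelihood) for the model N(w, 1),
    # dropping the additive constant (it cancels in the estimator)
    return 0.5 * (sx2 - 2.0 * w * sx + n * w ** 2)

def expected_nL(beta, num_samples=200_000):
    # Tempered posterior at inverse temperature beta, flat prior:
    # exactly N(xbar, 1/(n*beta)) in this regular model.
    # Singular models would require MCMC sampling here instead.
    w = rng.normal(x.mean(), np.sqrt(1.0 / (n * beta)), size=num_samples)
    return nL(w).mean()

b1 = 1.0 / np.log(n)   # WBIC-style inverse temperature 1/log(n)
b2 = 1.5 / np.log(n)   # a second nearby temperature
lam_hat = (expected_nL(b1) - expected_nL(b2)) / (1.0 / b1 - 1.0 / b2)
print(lam_hat)  # close to 0.5 = d/2 for this regular one-parameter model
```

The slope of the tempered expectation of n·L_n with respect to 1/beta recovers the RLCT; for genuinely singular models such as neural networks, the same identity holds but the tempered expectations must be approximated by posterior sampling, which is where an efficient estimator matters.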