View Submission - EcoSta 2025
A1165
Title: Weight matrices compression based on PDB model in deep neural networks
Authors: Zeng Li - Southern University of Science and Technology (China) [presenting]
Abstract: Weight matrix compression has been demonstrated to effectively reduce overfitting and improve the generalization performance of deep neural networks. Compression is primarily achieved by filtering out noisy eigenvalues of the weight matrix. A novel population double bulk (PDB) model is proposed to characterize the eigenvalue behavior of the weight matrix, which is more general than the existing population unit bulk (PUB) model. Based on the PDB model and random matrix theory (RMT), a new PDBLS algorithm is derived for determining the boundary between noisy and informative eigenvalues. A PDB noise-filtering algorithm is further introduced to reduce the rank of the weight matrix for compression. Experiments show that the PDB model fits the empirical eigenvalue distribution of the weight matrix better than the PUB model, and the compressed weight matrices attain a lower rank at the same level of test accuracy. In some cases, the compression method can even improve generalization performance when labels contain noise.
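The general recipe the abstract describes, filtering out eigenvalues that fall inside a noise bulk and keeping only the informative spikes, can be sketched in a few lines. A minimal illustration follows, assuming a simple stand-in for the PDBLS boundary: the Marchenko-Pastur upper bulk edge from classical RMT, with the noise variance estimated from the matrix entries. The function names (`noise_filter_weights`) and this particular threshold choice are illustrative, not the authors' PDBLS algorithm.

```python
import numpy as np

def noise_filter_weights(W):
    """Low-rank compression of a weight matrix by singular-value thresholding.

    Singular values whose squared, normalized magnitude falls below an
    RMT bulk edge are treated as noise and zeroed out. Here the edge is
    the Marchenko-Pastur upper edge (a stand-in for the PDBLS boundary).
    """
    n, p = W.shape
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    gamma = min(n, p) / max(n, p)              # aspect ratio of the matrix
    sigma2 = np.var(W)                          # crude noise-variance estimate
    edge = sigma2 * (1 + np.sqrt(gamma)) ** 2  # MP upper bulk edge
    # Eigenvalues of W^T W / max(n, p) correspond to s**2 / max(n, p).
    keep = (s ** 2 / max(n, p)) > edge
    s_filtered = np.where(keep, s, 0.0)
    W_compressed = (U * s_filtered) @ Vt
    return W_compressed, int(keep.sum())

# Example: a rank-3 signal buried in Gaussian noise.
rng = np.random.default_rng(0)
signal = sum(5.0 * np.outer(rng.standard_normal(200), rng.standard_normal(100))
             for _ in range(3))
W = signal + 0.1 * rng.standard_normal((200, 100))
W_c, rank_kept = noise_filter_weights(W)
```

On this synthetic example the filter retains only the few spiked singular values, so the compressed matrix has rank far below `min(n, p) = 100`. The paper's contribution is a sharper boundary (PDBLS, derived under the double-bulk population model) in place of the single-bulk MP edge used here.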