B1557
Title: Minimum error entropy principle for robust deep learning
Authors: Jun Fan - Hong Kong Baptist University (Hong Kong) [presenting]
Abstract: Information-theoretic learning is a machine learning approach that integrates concepts from information theory. Within this framework, the minimum error entropy principle plays a crucial role and yields a family of supervised learning algorithms. These algorithms offer an alternative to the traditional least squares method, particularly when dealing with heavy-tailed noise or outliers. In recent years, the integration of information-theoretic learning and deep learning has gained tremendous attention as a way to tackle the evolving challenges of modern machine learning. Minimum error entropy algorithms generated by deep convolutional neural networks are investigated in a regression setting. Learning rates are presented when the noise satisfies a weak moment condition.
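To make the contrast with least squares concrete, the following is a minimal sketch (not from the talk) of the standard empirical minimum error entropy criterion: the negative log of the information potential, a Parzen-window estimate tied to Renyi's quadratic entropy of the prediction errors. The bandwidth `sigma` and the toy error values are illustrative assumptions.

```python
import numpy as np

def mee_loss(errors, sigma=1.0):
    """Empirical MEE loss: -log of the information potential, i.e. a
    Gaussian-kernel (Parzen) estimate of Renyi's quadratic entropy of
    the errors. Minimizing it concentrates the error distribution,
    which downweights large outliers relative to mean squared error."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]                # pairwise error differences
    # Gaussian kernel with bandwidth sigma * sqrt(2)
    k = np.exp(-diffs**2 / (4.0 * sigma**2)) / (2.0 * sigma * np.sqrt(np.pi))
    ip = k.mean()                                  # information potential V(e)
    return -np.log(ip)

# Illustration: a single heavy-tailed outlier inflates MSE dramatically,
# while the MEE loss changes only modestly.
clean = np.array([0.1, -0.2, 0.05, 0.0, -0.1])    # hypothetical residuals
dirty = np.append(clean, 10.0)                    # one outlier added
mse_clean, mse_dirty = np.mean(clean**2), np.mean(dirty**2)
mee_clean, mee_dirty = mee_loss(clean), mee_loss(dirty)
```

Here the outlier multiplies the MSE by orders of magnitude but shifts the MEE loss only slightly, which is the robustness property the abstract refers to.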