A0185
Title: Highly robust training of regularized radial basis function networks
Authors: Jan Kalina - The Czech Academy of Sciences, Institute of Information Theory and Automation (Czech Republic) [presenting]
Abstract: Radial basis function (RBF) networks represent established tools for nonlinear regression modeling with numerous applications in various fields. Because their standard training is vulnerable to outliers in the data, several robust methods for RBF network training have been proposed recently. The focus is on robust regularized RBF networks. A robust inter-quantile version of RBF networks based on trimmed least squares is proposed. A systematic comparison of robust regularized RBF networks then follows, evaluated over a set of 405 networks trained with various combinations of robustness and regularization types. The experiments pay particular attention to the effect of variable selection, performed by a backward procedure, on the optimal number of RBF units. The regularized inter-quantile RBF networks based on trimmed least squares turn out to outperform the competing approaches in the experiments when a highly robust prediction error measure is considered.
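The abstract does not give implementation details, but the core idea of robust RBF training with trimmed least squares can be sketched as follows. This is a minimal illustration, not the author's method: it assumes Gaussian units with fixed centers and a common width, and fits only the output weights by least trimmed squares using concentration steps (repeatedly refitting on the h observations with the smallest squared residuals); the function names and all parameter choices are the sketch's own.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix with an intercept column."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * width ** 2))
    return np.column_stack([np.ones(len(X)), Phi])

def trimmed_ls_weights(Phi, y, h, n_iter=20, seed=0):
    """Least trimmed squares for the output weights via concentration
    steps: start from a random subset, then repeatedly refit ordinary
    least squares on the h observations with smallest squared residuals."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(y), size=h, replace=False)
    w = np.linalg.lstsq(Phi[idx], y[idx], rcond=None)[0]
    for _ in range(n_iter):
        r2 = (y - Phi @ w) ** 2
        idx = np.argsort(r2)[:h]           # keep h smallest residuals
        w_new = np.linalg.lstsq(Phi[idx], y[idx], rcond=None)[0]
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# Toy regression with 10% gross outliers (illustrative data only).
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
y[:20] += 8.0                              # contaminate 20 responses
centers = np.linspace(-3, 3, 7).reshape(-1, 1)
Phi = rbf_design(X, centers, width=1.0)
h = int(0.75 * len(y))                     # trimming constant: keep 75%
w = trimmed_ls_weights(Phi, y, h)
```

Because the trimming step discards the observations with the largest residuals before each refit, the contaminated responses stop influencing the final weights, which is the robustness property that standard (full-sample) least-squares training of RBF networks lacks.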