Title: Robust discriminant analysis for high dimensions
Authors: Valentin Todorov - UNIDO (Austria) [presenting]
Peter Filzmoser - Vienna University of Technology (Austria)
Abstract: The classical discriminant methods (LDA and QDA) can suffer from singularity problems with high-dimensional, small-sample-size data, which limits their practical application. A number of regularization techniques have been developed to stabilize the classifier and improve classification performance, and several studies compare them in order to facilitate the choice of a method. However, none of these methods takes into account the possible presence of outliers in the training data, which can strongly influence the resulting classification rules and make them unreliable. On the other hand, the high-breakdown-point versions of discriminant analysis proposed in the literature (with one exception) either do not work or are unreliable in high dimensions. The method we propose relies on the recently introduced regularized versions of the minimum covariance determinant (MCD) estimator - RMCD and MRCD - and combines high robustness to outliers, computability in high dimensions, and readily available software in R. Simulated and real data examples show that the proposed method performs better than, or at least as well as, the existing methods in a wide range of settings.
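To make the idea concrete, the following is a minimal sketch of robust quadratic discriminant analysis in the spirit described above. It is not the authors' method: it uses scikit-learn's plain MCD estimator (`MinCovDet`) per class, and the shrinkage toward a scaled identity is only an ad-hoc stand-in for the MRCD regularization; the actual RMCD/MRCD estimators are available in R. All function names below (`fit_robust_qda`, `predict_robust_qda`) and the shrinkage parameter are illustrative choices.

```python
# Sketch: per-class robust location/scatter via MCD, with an ad-hoc
# ridge-style shrinkage so the scatter stays well-conditioned.
# Illustrative only -- NOT the RMCD/MRCD estimators of the abstract.
import numpy as np
from sklearn.covariance import MinCovDet

def fit_robust_qda(X, y, shrinkage=0.1, random_state=0):
    """Robust QDA parameters: MCD mean/scatter per class, shrunk scatter."""
    params = {}
    for cls in np.unique(y):
        mcd = MinCovDet(random_state=random_state).fit(X[y == cls])
        S = mcd.covariance_
        # Shrink toward (tr(S)/p) * I -- a crude stand-in for the
        # target-matrix regularization used by MRCD.
        target = np.trace(S) / S.shape[0] * np.eye(S.shape[0])
        S_reg = (1 - shrinkage) * S + shrinkage * target
        params[cls] = (mcd.location_, S_reg, np.log(np.mean(y == cls)))
    return params

def predict_robust_qda(params, X):
    """Assign each row to the class with the largest quadratic score."""
    classes = sorted(params)
    scores = []
    for cls in classes:
        mu, S, logprior = params[cls]
        Sinv = np.linalg.inv(S)
        _, logdet = np.linalg.slogdet(S)
        d = X - mu
        maha = np.einsum("ij,jk,ik->i", d, Sinv, d)  # Mahalanobis distances
        scores.append(-0.5 * maha - 0.5 * logdet + logprior)
    return np.array(classes)[np.argmax(np.vstack(scores), axis=0)]

# Toy data: two separated Gaussian classes, with gross outliers
# contaminating the training sample of class 0.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(60, 5))
X1 = rng.normal(4.0, 1.0, size=(60, 5))
X0[:3] += 50.0  # outliers that would distort a classical QDA fit
X = np.vstack([X0, X1])
y = np.array([0] * 60 + [1] * 60)

params = fit_robust_qda(X, y)
pred = predict_robust_qda(params, np.array([[0.0] * 5, [4.0] * 5]))
print(pred)  # points near each class center are assigned correctly
```

Because the MCD downweights the contaminated rows, the estimated location of class 0 stays near the true center despite the outliers, which is the robustness property the abstract targets; the shrinkage step keeps the scatter invertible, which matters once the dimension approaches the per-class sample size.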