B0457
Title: Learning from MOM's principle: Theoretical results
Authors: Matthieu Lerasle - CNRS (France) [presenting]
Guillaume Lecue - CNRS and ENSAE (France)
Abstract: A new estimator obtained by minmax optimization of Median-Of-Means (MOM) criteria is presented; it can be regularized when some structure is assumed on the signal to be estimated. This estimator is particularly relevant in environments corrupted by outliers, which may be arbitrarily aggressive. Informative data (i.e. data that are not outliers) are only required to be independent (not necessarily identically distributed) and to satisfy a weak $L_2/L_1$ moment comparison assumption. In this setting, our estimator performs as well as the ERM and its regularized version would in a friendly i.i.d. sub-Gaussian environment, provided that the number of outliers does not exceed (number of data) $\times$ (minimax rate of convergence). These performances are achieved for many risk functions, even for some that are quite sensitive to the presence of outliers, such as the quadratic loss, as well as for any regularization norm. A particular emphasis will be put on the $\ell_1$, $S_1$ and SLOPE norms.
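For readers unfamiliar with the construction, the minmax MOM estimator referred to above can be sketched as follows. This is a schematic rendering of the MOM framework, not a formula quoted from the abstract: the class $F$, loss $\ell_f(x,y)$, block partition $B_1,\dots,B_K$ of $\{1,\dots,N\}$, and regularization parameter $\lambda$ are assumed notation.
\[
\hat f \in \operatorname*{argmin}_{f\in F}\ \sup_{g\in F}\ \Big\{ \mathrm{MOM}_K\big(\ell_f-\ell_g\big) + \lambda\big(\|f\| - \|g\|\big) \Big\},
\qquad
\mathrm{MOM}_K(h) = \operatorname*{median}_{1\le k\le K}\ \frac{1}{|B_k|}\sum_{i\in B_k} h(X_i,Y_i),
\]
where $\|\cdot\|$ is the regularization norm (e.g. $\ell_1$, $S_1$ or SLOPE) and $\lambda=0$ recovers the unregularized estimator. The median over blocks is what makes the criterion insensitive to the minority of blocks contaminated by outliers.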