A0457
Title: Robust clustered multi-task learning via outlier task detection with non-convex penalties
Authors: Akira Okazaki - The Institute of Statistical Mathematics (Japan) [presenting]
Shuichi Kawano - Kyushu University (Japan)
Abstract: Multi-task learning (MTL) is a methodology that aims to improve the overall performance of estimation and prediction by sharing common information among related tasks. In MTL, a natural assumption is that tasks can be grouped into clusters according to their characteristics. MTL methods under this assumption simultaneously estimate the task parameters and cluster them. However, the assumption that all tasks belong to some cluster is too strict, because outlier tasks that share no common information may exist in practical situations. If the task set is contaminated by such outlier tasks, the estimation accuracy of MTL deteriorates. To overcome this problem, an MTL method is proposed based on clustering with outlier parameters. The outlier parameters represent the parts not explained by the cluster centers and the constrained task-specific parameters. Outlier parameter vectors are selected via group sparse regularization, which makes the clustering robust against outlier tasks. The effectiveness of the proposed method is shown through Monte Carlo simulations and applications to real data.
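A minimal sketch of the kind of objective the abstract describes, written here only for illustration: the parameter vector of each task is decomposed into a cluster center, a constrained task-specific deviation, and an outlier vector, with a group-sparse (and, per the title, possibly non-convex) penalty selecting which tasks receive nonzero outlier vectors. The symbols, losses, and penalty form below are assumptions, not the authors' exact formulation:

\[
\min_{\{u_k\},\,\{v_m\},\,\{o_m\},\,c(\cdot)}\;
\sum_{m=1}^{T} \Big\{ L_m\!\big(u_{c(m)} + v_m + o_m\big)
\;+\; \lambda_1 \|v_m\|_2^2
\;+\; \lambda_2\, P_\gamma\!\big(\|o_m\|_2\big) \Big\},
\]

where \(T\) is the number of tasks, \(L_m\) is the loss on the data of task \(m\), \(c(m)\) assigns task \(m\) to a cluster with center \(u_{c(m)}\), \(\lambda_1\) constrains the task-specific deviations \(v_m\), and \(P_\gamma\) is a group penalty on the norm of the outlier vector \(o_m\) (e.g., a group version of SCAD or MCP). Under such a formulation, most \(o_m\) are shrunk exactly to zero, so only tasks poorly explained by their cluster center and constrained deviation are flagged as outliers, which is consistent with the robustness behavior described in the abstract.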