A0865
Title: Adversarially robust subspace learning in the spiked covariance model
Authors: Fei Sha - University of Nebraska-Lincoln (United States)
Ruizhi Zhang - University of Nebraska-Lincoln (United States) [presenting]
Abstract: The problem of robust subspace learning is studied when an adversary can attack the data to increase the projection error. By deriving the adversarial projection risk when the data follow a multivariate Gaussian distribution with spiked covariance (the spiked covariance model), we propose to use the empirical risk minimization method to obtain the optimal robust subspace. We then establish a non-asymptotic upper bound on the adversarial excess risk, which implies that the empirical risk minimization estimator is close to the optimal robust subspace. For the rank-one spiked covariance model, the optimization problem can be solved easily by the projected gradient descent algorithm. In general, however, solving the empirical risk minimization problem is computationally intractable. Thus, we propose to minimize an upper bound of the empirical risk to find the robust subspace for the general spiked covariance model.
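A minimal sketch of the projected gradient descent idea mentioned for the rank-one case. The paper's adversarial risk is not reproduced here; as a stand-in, the sketch minimizes the ordinary projection error -v^T S v over unit vectors v, with the projection step being renormalization to the unit sphere. All variable names and parameter values (dimension, sample size, spike strength, step size) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, theta = 20, 500, 5.0          # illustrative: dimension, sample size, spike strength
u = np.zeros(d)
u[0] = 1.0                           # true spike direction

# Rank-one spiked covariance: Sigma = I + theta * u u^T
X = rng.standard_normal((n, d)) + np.sqrt(theta) * rng.standard_normal((n, 1)) * u
S = X.T @ X / n                      # sample covariance

# Projected gradient descent over the unit sphere (stand-in loss: -v^T S v)
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
step = 0.05
for _ in range(200):
    grad = -2.0 * S @ v              # gradient of the stand-in loss -v^T S v
    v = v - step * grad              # gradient step
    v /= np.linalg.norm(v)           # projection back onto the unit sphere

# The estimate should align with the spike direction (up to sign)
alignment = abs(v @ u)
```

Under this surrogate loss the iteration behaves like a power method, so it recovers the spike direction quickly; the paper's actual objective adds an adversarial term to the risk being minimized.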