A0941
Title: Differentially private geodesic regression
Authors: Carlos Soto - University of Massachusetts Amherst (United States) [presenting]
Aditya Kulkarni - University of Massachusetts Amherst (United States)
Abstract: In statistical applications, it has become increasingly common to encounter data structures that live on nonlinear spaces such as manifolds. Classical linear regression, one of the most fundamental methodologies of statistical learning, captures the relationship between an independent variable and a response variable, both of which are assumed to live in Euclidean space. Geodesic regression emerged as an extension in which the response variable lives on a Riemannian manifold. The parameters of geodesic regression, as with linear regression, capture relationships in sensitive data, and hence one should consider privacy protections when releasing these parameters. Releasing differentially private (DP) parameters of geodesic regression is considered via the K-norm gradient (KNG) mechanism for Riemannian manifolds. Theoretical bounds are derived for the sensitivity of the parameters, showing that they are tied to their respective Jacobi fields and hence to the curvature of the space. This corroborates recent findings on differential privacy for the Fr\'echet mean. The efficacy of the methodology is demonstrated on the sphere, $\mathbb{S}^2\subset\mathbb{R}^3$, and, since the approach is general to Riemannian manifolds, on Euclidean space viewed as a manifold, where geodesic regression reduces to linear regression. This generality makes the methodology suitable for data in domains such as medical imaging and computer vision.
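A minimal sketch of the model and mechanism described above, in standard notation; the specific loss, parameterization, and norm below are illustrative assumptions rather than details taken from the abstract. Geodesic regression with base point $p\in\mathcal{M}$ and tangent vector $v\in T_p\mathcal{M}$ fits
\[
\hat{y}(x) = \operatorname{Exp}_p(x\,v), \qquad \mathcal{L}(p,v;D) = \frac{1}{2}\sum_{i=1}^{n} d\big(\operatorname{Exp}_p(x_i v),\, y_i\big)^2,
\]
and a KNG-style release samples the private parameters $\theta=(p,v)$ from a density proportional to
\[
f(\theta) \propto \exp\!\left(-\frac{\varepsilon}{2\Delta}\,\big\|\nabla_\theta \mathcal{L}(\theta;D)\big\|\right),
\]
where $\varepsilon$ is the privacy budget and $\Delta$ upper-bounds the sensitivity of the gradient norm, the quantity the abstract ties to Jacobi fields and the curvature of the space.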