CMStatistics 2022
B0701
Title: Algorithmic Gaussianization through sketching: Converting data into sub-Gaussian random designs
Authors: Michal Derezinski - University of Michigan (United States) [presenting]
Abstract: Algorithmic Gaussianization is a phenomenon that can arise when using randomized sketching or sampling methods to produce smaller representations of large datasets: for certain tasks, these sketched representations have been observed to exhibit many robust performance characteristics that are known to occur when a data sample comes from a sub-Gaussian random design, which is a powerful statistical model of data distributions. However, this phenomenon has only been studied for specific tasks and metrics, or by relying on computationally expensive methods. We address this by providing an algorithmic framework for Gaussianizing data distributions via averaging, proving that it is possible to efficiently construct data sketches that are nearly indistinguishable (in terms of total variation distance) from sub-Gaussian random designs. In particular, relying on a recently introduced sketching technique called Leverage Score Sparsified (LESS) embeddings, we show that one can construct an $n\times d$ sketch of an $N\times d$ matrix $A$, where $n \ll N$, that is nearly indistinguishable from a sub-Gaussian design, in time nearly linear in the size of $A$. As a consequence, strong statistical guarantees and precise asymptotics available for the estimators produced from sub-Gaussian designs can be straightforwardly adapted to our sketching framework. We illustrate this with a new approximation guarantee for sketched least squares.
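The sketched least-squares setting mentioned at the end of the abstract can be illustrated with a minimal NumPy sketch. This is not an implementation of LESS embeddings: a dense Gaussian sketching matrix stands in for the sparse, leverage-score-guided sketch, and the matrix `A`, response `b`, and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a tall N x d data matrix A and a noisy response b.
N, d, n = 10_000, 20, 400
A = rng.standard_normal((N, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(N)

# Sub-Gaussian sketching matrix S (n x N). A dense Gaussian sketch is used
# here purely for illustration; LESS embeddings achieve a comparable effect
# with a sparse S, in time nearly linear in the size of A.
S = rng.standard_normal((n, N)) / np.sqrt(n)

# Solve the sketched problem min_x ||S A x - S b||_2 on the small n x d design,
# and compare against the full least-squares solution.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)

rel_err = np.linalg.norm(x_sketch - x_full) / np.linalg.norm(x_full)
```

Because the sketched design behaves like a sub-Gaussian random design, `x_sketch` approximates `x_full` well even though the sketched problem has only $n \ll N$ rows.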