B0281
Title: Fast and powerful conditional randomization testing via distillation
Authors: Lucas Janson - Harvard University (United States) [presenting]
Abstract: Given a response $Y$ and covariates $(X, Z)$, we consider testing the null hypothesis that $Y$ is conditionally independent of $X$ given $Z$. The conditional randomization test (CRT) was recently proposed as a way to use distributional information about $X | Z$ to exactly control Type-I error using any test statistic in any dimensionality without assuming anything about $Y | (X, Z)$. This flexibility in principle allows one to derive powerful test statistics from complex state-of-the-art machine learning algorithms while maintaining statistical validity. Yet the direct use of such advanced test statistics in the CRT is prohibitively computationally expensive, especially with multiple testing, because the CRT requires recomputing the test statistic many times on resampled data. We propose the distilled CRT, a novel approach to using state-of-the-art machine learning algorithms in the CRT while drastically reducing the number of times those algorithms need to be run, thereby taking advantage of their power and the CRT's statistical guarantees without the usual computational expense. In simulations, our proposals combined yield a test with power similar to the CRT's while requiring orders of magnitude less computation, making it a practical tool even for large data sets. We demonstrate these benefits on a breast cancer dataset by identifying biomarkers related to cancer stage.
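To make the computational contrast concrete, below is a minimal, illustrative sketch (not the paper's implementation) in Python: the vanilla CRT refits an expensive learner for every resample of $X | Z$, whereas a distilled variant runs the expensive fit once (on $Z$ alone) and scores each resample with a cheap residual statistic. The Gaussian linear model for $X | Z$, the lasso as the "expensive" learner, and the residual-product statistic are assumptions made purely for illustration.

```python
# Illustrative sketch only: contrasts a vanilla CRT with a distilled variant,
# assuming X | Z is Gaussian with a linear mean and using LassoCV as a stand-in
# for an expensive machine-learning test statistic.
import numpy as np
from sklearn.linear_model import LassoCV

def crt_pvalue(y, x, Z, n_resample=500, seed=0):
    """Vanilla CRT: refit the expensive learner for every resample of X | Z."""
    rng = np.random.default_rng(seed)
    # Model X | Z (assumed Gaussian linear here) to generate null resamples.
    beta = np.linalg.lstsq(Z, x, rcond=None)[0]
    mu, sigma = Z @ beta, np.std(x - Z @ beta)

    def stat(x_col):
        # Expensive statistic: |lasso coefficient of X| when regressing Y on (X, Z).
        fit = LassoCV(cv=5).fit(np.column_stack([x_col, Z]), y)
        return abs(fit.coef_[0])

    t_obs = stat(x)
    t_null = [stat(mu + sigma * rng.standard_normal(len(y))) for _ in range(n_resample)]
    return (1 + sum(t >= t_obs for t in t_null)) / (1 + n_resample)

def dcrt_pvalue(y, x, Z, n_resample=500, seed=0):
    """Distilled variant: run the expensive learner once on Z alone, then use a
    cheap residual statistic for the observed X and every resample."""
    rng = np.random.default_rng(seed)
    beta = np.linalg.lstsq(Z, x, rcond=None)[0]
    mu, sigma = Z @ beta, np.std(x - Z @ beta)

    # Distillation: a single expensive fit of Y on Z.
    d_y = y - LassoCV(cv=5).fit(Z, y).predict(Z)

    def stat(x_col):
        # Cheap statistic: |inner product| of residualized Y with residualized X.
        return abs(np.dot(d_y, x_col - mu))

    t_obs = stat(x)
    t_null = [stat(mu + sigma * rng.standard_normal(len(y))) for _ in range(n_resample)]
    return (1 + sum(t >= t_obs for t in t_null)) / (1 + n_resample)
```

In this sketch, with 500 resamples the vanilla routine fits the lasso 501 times while the distilled routine fits it once, which illustrates the kind of saving the abstract describes, though the paper's actual constructions and guarantees go beyond this toy example.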