CMStatistics 2022
B0748
Title: FACT: High-dimensional random forests inference
Authors: Chien-Ming Chi - Academia Sinica (Taiwan) [presenting]
Yingying Fan - University of Southern California (United States)
Jinchi Lv - University of Southern California (United States)
Abstract: Random forests have been one of the most widely used machine learning methods over the past decade. Yet, because of their black-box nature, results produced by random forests can be hard to interpret in many big data applications. Quantifying the usefulness of individual features in random forests learning can greatly enhance interpretability. Existing studies have shown that some popular feature importance measures for random forests suffer from bias. In addition, comprehensive size and power analyses are lacking for most of these existing methods. We approach the problem via hypothesis testing and propose a framework of the self-normalized feature-residual correlation test (FACT) for evaluating the significance of a given feature in the random forests model, with a bias-resistance property; the null hypothesis concerns whether the feature is conditionally independent of the response given all other features. The vanilla version of the FACT test can suffer from bias in the presence of feature dependency. We exploit the techniques of imbalancing and conditioning for bias correction, and further incorporate the ensemble idea into the FACT statistic through feature transformations for enhanced power. Under a general high-dimensional nonparametric model setting with dependent features, we formally establish that FACT can provide theoretically justified random forests feature p-values and enjoys appealing power through nonasymptotic analyses.
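The core idea of a self-normalized feature-residual correlation statistic can be sketched as follows. This is a minimal illustrative toy in NumPy, not the authors' actual FACT procedure: it omits the random forests fits, the imbalancing/conditioning bias corrections, and the ensemble feature transformations described in the abstract, and it uses oracle residuals as a stand-in for fitted residuals. All function and variable names here are hypothetical.

```python
import numpy as np
from math import erfc, sqrt

def feature_residual_test(x, resid):
    """Self-normalized feature-residual correlation statistic (illustrative).

    With products w_i = x_i * resid_i, the statistic
        T = sum(w_i) / sqrt(sum(w_i**2))
    is approximately N(0, 1) when the feature carries no signal left in the
    residuals; a two-sided normal p-value is obtained via erfc.
    """
    w = x * resid
    t = w.sum() / sqrt((w ** 2).sum())
    p = erfc(abs(t) / sqrt(2.0))  # two-sided N(0,1) tail probability
    return t, p

rng = np.random.default_rng(0)
n = 2000
x1 = rng.standard_normal(n)          # informative feature
x2 = rng.standard_normal(n)          # null feature, independent of y
y = np.sin(x1) + 0.5 * rng.standard_normal(n)

# Oracle residuals stand in for model fits:
resid_full = y - np.sin(x1)          # residuals from a fit that uses x1
resid_drop = y                       # residuals from a "fit" that omits x1

t_null, p_null = feature_residual_test(x2, resid_full)  # x2 is null
t_sig, p_sig = feature_residual_test(x1, resid_drop)    # x1 signal remains

print(f"null feature:        T = {t_null:.2f}, p = {p_null:.3f}")
print(f"informative feature: T = {t_sig:.2f}, p = {p_sig:.2e}")
```

Under the null, the products x_i * resid_i have mean zero, so T stays near zero and the p-value is roughly uniform; when dropping the feature leaves its signal in the residuals, the products share a common sign and |T| grows with the sample size.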