B1142
Title: Testing the linear mean and constant variance conditions in sufficient dimension reduction
Authors: Yuexiao Dong - Temple University (United States) [presenting]
Abstract: Sufficient dimension reduction methods characterize the relationship between the response $Y$ and the covariates $X$ through a few linear combinations of the covariates. Numerous techniques have been developed, among which the inverse regression-based methods are perhaps the most appealing in practice because they do not involve multi-dimensional smoothing and are easy to implement. However, these inverse regression-based methods require two distributional assumptions on the covariates. In particular, the first-order methods, such as the sliced inverse regression, require the linear conditional mean (LCM) assumption, while the second-order methods, such as the sliced average variance estimation, additionally require the constant conditional variance (CCV) assumption. We propose to check the validity of the LCM and the CCV conditions through mean independence tests, which are facilitated by the martingale difference divergence. We suggest a consistent bootstrap procedure to determine the critical values of the tests. Monte Carlo simulations and an application to the horse mussels dataset demonstrate the finite-sample performance of our proposal.
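For reference, the two conditions take their standard form in the sufficient dimension reduction literature; the basis matrix $\beta$ of the target subspace below is notation assumed here rather than given in the abstract:
$$
\text{(LCM)}\quad E(X \mid \beta^{\top}X) \text{ is a linear function of } \beta^{\top}X; \qquad
\text{(CCV)}\quad \mathrm{Cov}(X \mid \beta^{\top}X) \text{ is a nonrandom matrix.}
$$
The martingale difference divergence used to build the mean independence tests admits, in one commonly used representation, the form
$$
\mathrm{MDD}(V \mid U)^2 = -E\big[(V - EV)(V' - EV')\,\|U - U'\|\big],
$$
where $(V', U')$ is an independent copy of $(V, U)$ and $V$, $U$ have finite second moments; $\mathrm{MDD}(V \mid U) = 0$ if and only if $E(V \mid U) = E(V)$ almost surely, which is what makes it a natural device for testing mean independence.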