B1201
Title: Deep nonlinear sufficient dimension reduction
Authors: Zhou Yu - East China Normal University (China) [presenting]
Abstract: Linear sufficient dimension reduction, as exemplified by sliced inverse regression, has seen substantial development over the past thirty years. With the advent of more complex data scenarios, however, nonlinear sufficient dimension reduction, as a more general framework, has recently attracted considerable interest. A novel method for nonlinear sufficient dimension reduction is introduced, which combines the generalized martingale difference divergence measure with deep neural networks. The optimal solution of the objective function is shown to be unbiased at the general level of $\sigma$-fields. Two optimization schemes based on deep neural networks are considered, offering greater efficiency and flexibility than the classical eigendecomposition of linear operators. Moreover, slow and fast rates for the estimation error are systematically investigated using advanced U-process theory; remarkably, the fast rate is nearly minimax optimal. The effectiveness of the deep nonlinear sufficient dimension reduction methods is demonstrated through simulations and real data analysis.
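To make the idea concrete, the estimation step described in the abstract can be pictured as maximizing an empirical martingale difference divergence (MDD)-type criterion over a neural-network representation of the predictors. The sketch below is a minimal, hypothetical PyTorch illustration, not the authors' implementation: it uses the ordinary MDD for a scalar response (rather than the generalized version) in its V-statistic form, and the network `Encoder`, the loss `neg_mdd`, and all hyperparameters are assumptions made for illustration only.

```python
# Hypothetical sketch (not the authors' code): learn a nonlinear reduction
# f_theta(X) by maximizing an empirical MDD-type criterion between Y and f_theta(X).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Deep network mapping p-dimensional X to a d-dimensional reduction."""
    def __init__(self, p, d, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(p, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, d),
        )

    def forward(self, x):
        return self.net(x)

def neg_mdd(y, z, eps=1e-12):
    """Negative empirical (V-statistic) MDD^2 of Y given Z = f_theta(X):
    MDD^2 = -(1/n^2) * sum_{i,j} (Y_i - Ybar)(Y_j - Ybar) * ||Z_i - Z_j||.
    Minimizing this loss maximizes the conditional-mean dependence of Y on Z."""
    yc = y - y.mean()                          # centered response
    diff = z.unsqueeze(1) - z.unsqueeze(0)     # (n, n, d) pairwise differences
    dz = diff.pow(2).sum(-1).add(eps).sqrt()   # pairwise Euclidean distances
    mdd2 = -(yc @ dz @ yc) / y.numel() ** 2
    return -mdd2                               # minimize -MDD^2 == maximize MDD^2

# Toy usage: Y depends on X only through a nonlinear function of two coordinates.
torch.manual_seed(0)
n, p, d = 512, 10, 1
X = torch.randn(n, p)
Y = torch.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * torch.randn(n)

enc = Encoder(p, d)
opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = neg_mdd(Y, enc(X))
    loss.backward()
    opt.step()
```

A U-statistic variant of this criterion, as suggested by the abstract's appeal to U-process theory, would simply exclude the diagonal terms of the pairwise sum; the V-statistic form is used above only because it is shorter to write.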