EcoSta 2025 Submission A0796
Title: Regularization in mixture-of-experts in ultra-high dimensional feature spaces
Authors: Abbas Khalili - McGill University (Canada) [presenting]
Abstract: Mixture-of-experts (MoE) models provide a flexible statistical framework for capturing unobserved heterogeneity in data. In modern applications of MoEs, the dimension of the feature space is large relative to the typical sample size of a training dataset, which makes statistical inference intrinsically challenging. Penalized likelihood estimation and feature selection methods are proposed and studied for sparse MoEs in ultrahigh-dimensional feature spaces, where the experts are generalized linear models (GLMs); these models are referred to as sparse MoE-GLMs. Under general conditions, consistency of estimation and feature selection is established for the proposed methods. Their empirical performance is assessed through extensive simulations, and their application is illustrated in a real data analysis. The work offers a comprehensive and detailed analysis of regularization methods for sparse MoE-GLM models in the ultrahigh-dimensional setting.
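For orientation, the kind of objective at issue can be sketched as a penalized MoE-GLM log-likelihood. The softmax gating, the GLM expert densities, and the generic sparsity penalties below are illustrative assumptions, not details taken from the submission; the specific penalties and conditions studied in the paper may differ.

% Sketch of a penalized log-likelihood for a K-expert MoE-GLM (illustrative only):
% softmax gating weights \pi_k(x;\alpha) and GLM expert densities f(y | x^T \beta_k, \phi_k).
\[
  \pi_k(x;\alpha) \;=\; \frac{\exp(x^\top \alpha_k)}{\sum_{l=1}^{K} \exp(x^\top \alpha_l)},
\]
\[
  \ell_n^{\mathrm{pen}}(\theta)
  \;=\; \sum_{i=1}^{n} \log \Bigl\{ \sum_{k=1}^{K} \pi_k(x_i;\alpha)\, f\bigl(y_i \mid x_i^\top \beta_k, \phi_k\bigr) \Bigr\}
  \;-\; \sum_{k=1}^{K} \sum_{j=1}^{p} p_{\lambda_n}\!\bigl(|\beta_{kj}|\bigr)
  \;-\; \sum_{k=1}^{K} \sum_{j=1}^{p} p_{\gamma_n}\!\bigl(|\alpha_{kj}|\bigr),
\]
% where p_{\lambda} denotes a sparsity-inducing penalty (e.g. Lasso or SCAD) applied to both
% the expert coefficients \beta_k and the gating coefficients \alpha_k, with tuning parameters
% \lambda_n, \gamma_n allowed to grow with n and the ambient dimension p.

In this generic form, feature selection corresponds to the penalty shrinking irrelevant coordinates of the expert and gating coefficient vectors exactly to zero as n grows.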