View Submission - EcoSta 2025
A0513
Title: Byzantine-tolerant distributed learning of finite mixture models
Authors: Qiong Zhang - Renmin University of China (China) [presenting]
Yan Shuo Tan - National University of Singapore (Singapore)
Jiahua Chen - University of British Columbia (Canada)
Abstract: Traditional statistical methods need to be updated to work with modern distributed data storage paradigms. A common approach is the split-and-conquer framework, which learns models on local machines and averages their parameter estimates. However, this fails for the important problem of learning finite mixture models, because subpopulation indices on each local machine may be arbitrarily permuted (the "label switching" problem). Mixture reduction (MR) has been proposed to address this issue, but MR remains vulnerable to Byzantine failure, whereby a fraction of local machines may transmit arbitrarily erroneous information. Distance-filtered mixture reduction (DFMR) is introduced: a Byzantine-tolerant adaptation of MR that is both computationally efficient and statistically sound. DFMR leverages the densities of the local estimates to construct a robust filtering mechanism: by analyzing the pairwise $L^2$ distances between local estimates, it identifies and removes severely corrupted estimates while retaining the majority of uncorrupted ones. Theoretical justification is provided for DFMR, proving its optimal convergence rate and asymptotic equivalence to the global maximum likelihood estimate under standard assumptions. Numerical experiments on simulated and real-world data validate the effectiveness of DFMR in achieving robust and accurate aggregation in the presence of Byzantine failure.
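
To make the distance-filtering step concrete, the following is a minimal Python sketch assuming Gaussian component densities, for which the $L^2$ inner product of mixture densities has a closed form. The median-distance keep rule and the keep_frac parameter are illustrative assumptions, not the paper's exact filtering mechanism.

    # Illustrative DFMR-style filter for Gaussian mixture estimates.
    # Each local estimate is a tuple (weights, means, covariances).
    import numpy as np
    from scipy.stats import multivariate_normal

    def l2_inner(w1, mu1, cov1, w2, mu2, cov2):
        # <f, g> = sum_ij w1_i * w2_j * N(mu1_i; mu2_j, cov1_i + cov2_j),
        # the closed-form L2 inner product of Gaussian mixture densities.
        total = 0.0
        for wi, mi, ci in zip(w1, mu1, cov1):
            for wj, mj, cj in zip(w2, mu2, cov2):
                total += wi * wj * multivariate_normal.pdf(mi, mean=mj, cov=ci + cj)
        return total

    def l2_distance_sq(f, g):
        # Squared L2 distance ||f - g||^2 between two Gaussian mixtures.
        return l2_inner(*f, *f) - 2.0 * l2_inner(*f, *g) + l2_inner(*g, *g)

    def distance_filter(local_fits, keep_frac=0.7):
        # Keep the local estimates whose median pairwise L2 distance to the
        # others is smallest; Byzantine estimates tend to lie far from the bulk.
        # keep_frac is a hypothetical tuning parameter for this sketch.
        m = len(local_fits)
        D = np.zeros((m, m))
        for a in range(m):
            for b in range(a + 1, m):
                D[a, b] = D[b, a] = l2_distance_sq(local_fits[a], local_fits[b])
        score = np.median(D, axis=1)  # robust centrality score per machine
        keep = np.argsort(score)[: int(np.ceil(keep_frac * m))]
        return [local_fits[i] for i in keep]

Under this sketch, the surviving local estimates would then be passed to a standard mixture reduction step to produce the aggregated mixture.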