A1444
Title: Statistical inference for decentralized federated learning
Authors: Jia Gu - Center for Data Science, Zhejiang University (China) [presenting]
Songxi Chen - Peking University (China)
Abstract: The purpose is to consider decentralized federated learning (FL) for M-estimation under heterogeneous distributions among distributed clients or data blocks. The mean squared error and the consensus error across the estimators produced on different clients by the decentralized stochastic gradient descent (DSGD) algorithm are derived. The asymptotic normality of the Polyak-Ruppert (PR) averaged estimator in the decentralized distributed setting is attained, which shows that its statistical efficiency comes at a cost: the condition on the number of clients is more restrictive than in distributed M-estimation. To overcome the restriction, a one-step estimator is proposed that permits a much larger number of clients while still achieving the same efficiency as the PR-averaged estimator in the non-distributed setting. Confidence regions based on both the PR-averaged estimator and the proposed one-step estimator are constructed to facilitate statistical inference for decentralized FL.
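For orientation, the display below sketches the quantities the abstract refers to. The notation (K clients with losses f_j, a doubly stochastic mixing matrix W, step sizes gamma_t) is assumed for illustration and is not quoted from the paper.

```latex
\begin{align*}
  % M-estimation target: minimizer of the pooled population risk
  \theta^{*} &= \arg\min_{\theta}\; \frac{1}{K}\sum_{j=1}^{K}
      \mathbb{E}\, f_j(\theta; Z_j), \\
  % DSGD: gossip averaging with mixing matrix W=(w_{jk}),
  % followed by a local stochastic gradient step
  \theta_j^{(t+1)} &= \sum_{k=1}^{K} w_{jk}\,\theta_k^{(t)}
      \;-\; \gamma_t\,\nabla f_j\bigl(\theta_j^{(t)}; Z_j^{(t)}\bigr), \\
  % Polyak--Ruppert average over iterations and clients
  \bar{\theta}_{T} &= \frac{1}{KT}\sum_{t=1}^{T}\sum_{j=1}^{K}
      \theta_j^{(t)}, \\
  % One-step (Newton) correction from the PR average
  \hat{\theta}_{\mathrm{os}} &= \bar{\theta}_{T}
      \;-\; \widehat{H}^{-1}\,\widehat{G}\bigl(\bar{\theta}_{T}\bigr),
\end{align*}
% where \widehat{G} and \widehat{H} denote aggregated sample estimates
% of the pooled gradient and Hessian.
```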
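A minimal simulation sketch of this pipeline follows, assuming a least-squares M-estimation problem, a ring gossip topology, and illustrative choices of client count, step size, and evaluation sample size; none of these specifics come from the paper.

```python
# Sketch: DSGD with Polyak-Ruppert averaging and a one-step Newton
# correction on simulated heterogeneous least-squares clients.
# All constants and the data-generating process are assumptions.
import numpy as np

rng = np.random.default_rng(0)
K, T, d = 10, 5000, 3              # clients, iterations, dimension
theta_star = np.array([1.0, -0.5, 2.0])

# Doubly stochastic mixing matrix for a ring: average with both neighbors.
W = np.zeros((K, K))
for j in range(K):
    W[j, j] = W[j, (j - 1) % K] = W[j, (j + 1) % K] = 1.0 / 3.0

# Heterogeneous clients: each has its own covariate scale.
scales = 1.0 + 0.5 * rng.random(K)

def sample_grad(j, theta):
    """Stochastic gradient of 0.5 * (y - x'theta)^2 at client j."""
    x = scales[j] * rng.standard_normal(d)
    y = x @ theta_star + rng.standard_normal()
    return (x @ theta - y) * x

theta = np.zeros((K, d))           # one iterate per client
pr_sum = np.zeros(d)               # running sum for the PR average
for t in range(1, T + 1):
    gamma = 0.5 * t ** (-0.7)      # polynomially decaying step size
    grads = np.stack([sample_grad(j, theta[j]) for j in range(K)])
    theta = W @ theta - gamma * grads   # gossip step + local SGD step
    pr_sum += theta.mean(axis=0)

theta_pr = pr_sum / T              # Polyak-Ruppert averaged estimator

# One-step correction: a single aggregated Newton step from theta_pr,
# using fresh samples to estimate the pooled gradient and Hessian.
n_eval = 2000
G, H = np.zeros(d), np.zeros((d, d))
for j in range(K):
    X = scales[j] * rng.standard_normal((n_eval, d))
    y = X @ theta_star + rng.standard_normal(n_eval)
    G += X.T @ (X @ theta_pr - y) / n_eval
    H += X.T @ X / n_eval
G, H = G / K, H / K
theta_os = theta_pr - np.linalg.solve(H, G)

print("PR-averaged estimate :", np.round(theta_pr, 3))
print("one-step estimate    :", np.round(theta_os, 3))
print("true parameter       :", theta_star)
```

The one-step estimator here is simply a single aggregated Newton step launched from the PR average, which illustrates why it can tolerate a larger number of clients while matching the non-distributed efficiency.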