A0314
Title: Network Gradient Descent and Expectation-Maximization Algorithms in Decentralized Federated Learning
Authors: Shuyuan Wu - Shanghai University of Finance and Economics (China) [presenting]
Abstract: We study two fully decentralized federated learning algorithms: Network Gradient Descent (NGD) and Network Expectation Maximization (NEM). In these methods, clients communicate only parameter estimators, which minimizes privacy risks and enhances reliability through a specially designed network structure. Our theoretical analysis shows that the learning rate and the network structure significantly affect the statistical efficiency of the resulting estimators. With a sufficiently small learning rate and a well-balanced network, the estimators achieve statistical efficiency comparable to that of the global estimator, even under heterogeneous data distributions. Extensive simulations and real data analyses are conducted to validate our theoretical findings.
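A minimal sketch of the kind of iteration NGD describes, under standard decentralized-learning assumptions (not the paper's exact formulation): each client mixes its neighbors' parameter estimates through a doubly stochastic weight matrix and then takes a local gradient step, so only estimators, never raw data, are exchanged. The quadratic local losses, ring network, and step size below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
K, p, n = 4, 3, 200            # clients, parameter dimension, samples per client
theta_true = np.array([1.0, -2.0, 0.5])

# Heterogeneous local data: each client has its own design matrix and noise.
X = [rng.normal(size=(n, p)) for _ in range(K)]
y = [Xk @ theta_true + 0.1 * rng.normal(size=n) for Xk in X]

def local_grad(k, theta):
    # Gradient of client k's average squared-error loss.
    return X[k].T @ (X[k] @ theta - y[k]) / n

# Doubly stochastic mixing matrix for a ring network (a "balanced" topology).
W = np.zeros((K, K))
for k in range(K):
    W[k, k] = 0.5
    W[k, (k - 1) % K] = 0.25
    W[k, (k + 1) % K] = 0.25

lr = 0.1                        # small learning rate, as the theory suggests
Theta = np.zeros((K, p))        # one parameter estimate per client
for _ in range(500):
    mixed = W @ Theta           # communicate estimators with neighbors only
    Theta = np.array([mixed[k] - lr * local_grad(k, mixed[k])
                      for k in range(K)])

# Global (pooled-data) least-squares estimator for comparison.
Xall, yall = np.vstack(X), np.concatenate(y)
theta_global = np.linalg.lstsq(Xall, yall, rcond=None)[0]

# With a small step size and a balanced network, every client's estimate
# lands close to the global estimator despite heterogeneous local data.
max_diff = float(np.max(np.abs(Theta - theta_global)))
print(max_diff)
```

With larger learning rates or a poorly connected network, the per-client fixed points drift further from the global estimator, which is the qualitative effect the abstract's efficiency results quantify.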