FEDERATED STOCHASTIC GRADIENT DESCENT BEGETS SELF-INDUCED MOMENTUM

Howard H. Yang, Zuozhu Liu, Yaru Fu, Tony Q.S. Quek, H. Vincent Poor

Research output: Conference contribution

Abstract

Federated learning (FL) is an emerging machine learning method for mobile edge systems, in which a server and a host of clients collaboratively train a statistical model using the clients' data and computation resources without directly exposing their privacy-sensitive data. We show that running stochastic gradient descent (SGD) in such a setting can be viewed as adding a momentum-like term to the global aggregation process. Based on this finding, we further analyze the convergence rate of a federated learning system, accounting for the effects of parameter staleness and communication resources. These results advance the understanding of the federated SGD algorithm and forge a link between staleness analysis and federated computing systems, which can be useful to system designers.
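To make the setting concrete, the sketch below implements one plausible version of the federated SGD loop the abstract describes: a server broadcasts the global model, each client runs a few local SGD steps on its own data, and the server averages the returned models. The problem (synthetic least squares) and all choices of client count, batch size, step size, and local-step count are illustrative assumptions, not the paper's setup; the momentum-like term the paper identifies emerges analytically from this aggregation rather than appearing as an explicit line of code.

import numpy as np

# Illustrative federated SGD round on a synthetic least-squares problem.
# Every dimension and hyperparameter here is an assumption for the sketch.
rng = np.random.default_rng(0)
d, K, n = 5, 4, 50                      # model dim, number of clients, samples per client
x_true = rng.normal(size=d)             # ground-truth model (for the synthetic data)
A = [rng.normal(size=(n, d)) for _ in range(K)]
b = [A_k @ x_true + 0.1 * rng.normal(size=n) for A_k in A]

def local_sgd(k, x, steps=5, batch=10, eta=0.05):
    """Client k: run `steps` mini-batch SGD steps on 0.5 * ||A_k x - b_k||^2."""
    x = x.copy()
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        A_b, b_b = A[k][idx], b[k][idx]
        x -= eta * A_b.T @ (A_b @ x - b_b) / batch
    return x

x = np.zeros(d)                         # global model held at the server
for t in range(100):
    # Server broadcasts x; each client trains locally; the server then
    # averages the locally updated models into the new global model.
    x = np.mean([local_sgd(k, x) for k in range(K)], axis=0)

print("distance to ground truth:", np.linalg.norm(x - x_true))

In this view, each aggregation folds the clients' accumulated local progress back into the global iterate; the paper's analysis shows that this folded-in progress acts on the global sequence like a momentum term, which is why staleness and communication constraints enter its convergence bounds.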

Original language: English
Title of host publication: 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 9027-9031
Number of pages: 5
ISBN (electronic): 9781665405409
DOIs
Publication status: Published - 2022
Event: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Virtual, Online, Singapore
Duration: 23 May 2022 - 27 May 2022

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2022-May
ISSN (print): 1520-6149

Conference

Conference: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022
Country/Territory: Singapore
City: Virtual, Online
Period: 23 May 2022 - 27 May 2022

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
