TY - JOUR
T1 - Asynchronous Federated Learning over Wireless Communication Networks
AU - Wang, Zhongyu
AU - Zhang, Zhaoyang
AU - Tian, Yuqing
AU - Yang, Qianqian
AU - Shan, Hangguan
AU - Wang, Wei
AU - Quek, Tony Q.S.
N1 - Publisher Copyright: IEEE
PY - 2022/9/1
Y1 - 2022/9/1
N2 - The conventional federated learning (FL) framework typically assumes synchronous reception and fusion of all local models at the central aggregator, as well as synchronous updating and training of the global model at all agents. However, in a wireless network, due to limited radio resources, inevitable transmission failures, and heterogeneous computing capacities, it is very difficult to achieve strict synchronization among all involved user equipments (UEs). In this paper, we propose a novel asynchronous FL framework that adapts well to the heterogeneity of users, communication environments, and learning tasks by accounting for both the possible delays in training and uploading the local models and the resultant staleness among the received models, which heavily impacts the global model fusion. A novel centralized fusion algorithm is designed to determine the fusion weights during the global update; it aims to make full use of the fresh information contained in the uploaded local models while avoiding biased convergence by enforcing that the impact of each UE’s local dataset is proportional to its sample share. Numerical experiments validate that the proposed asynchronous FL framework achieves fast and smooth convergence and significantly enhances training efficiency.
AB - The conventional federated learning (FL) framework typically assumes synchronous reception and fusion of all local models at the central aggregator, as well as synchronous updating and training of the global model at all agents. However, in a wireless network, due to limited radio resources, inevitable transmission failures, and heterogeneous computing capacities, it is very difficult to achieve strict synchronization among all involved user equipments (UEs). In this paper, we propose a novel asynchronous FL framework that adapts well to the heterogeneity of users, communication environments, and learning tasks by accounting for both the possible delays in training and uploading the local models and the resultant staleness among the received models, which heavily impacts the global model fusion. A novel centralized fusion algorithm is designed to determine the fusion weights during the global update; it aims to make full use of the fresh information contained in the uploaded local models while avoiding biased convergence by enforcing that the impact of each UE’s local dataset is proportional to its sample share. Numerical experiments validate that the proposed asynchronous FL framework achieves fast and smooth convergence and significantly enhances training efficiency.
UR - http://www.scopus.com/inward/record.url?scp=85125751390&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85125751390&partnerID=8YFLogxK
U2 - 10.1109/TWC.2022.3153495
DO - 10.1109/TWC.2022.3153495
M3 - Article
AN - SCOPUS:85125751390
SN - 1536-1276
JO - IEEE Transactions on Wireless Communications
JF - IEEE Transactions on Wireless Communications
ER -