TY - GEN
T1 - Personalizing Federated Learning with Over-The-Air Computations
AU - Chen, Zihan
AU - Li, Zeshen
AU - Yang, Howard H.
AU - Quek, Tony Q.S.
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Federated edge learning is a promising technology for deploying intelligence at the edge of wireless networks in a privacy-preserving manner. In this setting, multiple clients collaboratively train a global generic model under the coordination of an edge server, but training efficiency is often hindered by limited communication and data heterogeneity. In this paper, we present a distributed training paradigm that employs analog over-the-air computation to alleviate the communication bottleneck. Additionally, we leverage a bi-level optimization framework to personalize the federated learning model to cope with data heterogeneity, thereby enhancing the generalization and robustness of each client's local model. We elaborate on the model training procedure and its advantages over conventional frameworks, provide a convergence analysis that theoretically demonstrates the training efficiency, and conduct extensive experiments to validate the efficacy of the proposed framework.
AB - Federated edge learning is a promising technology for deploying intelligence at the edge of wireless networks in a privacy-preserving manner. In this setting, multiple clients collaboratively train a global generic model under the coordination of an edge server, but training efficiency is often hindered by limited communication and data heterogeneity. In this paper, we present a distributed training paradigm that employs analog over-the-air computation to alleviate the communication bottleneck. Additionally, we leverage a bi-level optimization framework to personalize the federated learning model to cope with data heterogeneity, thereby enhancing the generalization and robustness of each client's local model. We elaborate on the model training procedure and its advantages over conventional frameworks, provide a convergence analysis that theoretically demonstrates the training efficiency, and conduct extensive experiments to validate the efficacy of the proposed framework.
UR - http://www.scopus.com/inward/record.url?scp=85177565195&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85177565195&partnerID=8YFLogxK
U2 - 10.1109/ICASSP49357.2023.10095533
DO - 10.1109/ICASSP49357.2023.10095533
M3 - Conference contribution
AN - SCOPUS:85177565195
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
BT - ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023
Y2 - 4 June 2023 through 10 June 2023
ER -