Boosting Semi-Supervised Federated Learning with Model Personalization and Client-Variance-Reduction

Shuai Wang, Yanqing Xu, Yanli Yuan, Xiuhua Wang, Tony Q.S. Quek

Research output: Conference article › peer-reviewed

1 citation (Scopus)


Recently, federated learning (FL) has become increasingly appealing in distributed signal processing and machine learning. Nevertheless, the practical challenges of label deficiency and client heterogeneity form a bottleneck to its wide adoption. Although numerous efforts have been devoted to semi-supervised FL, most of the adopted algorithms follow the same spirit as FedAvg and thus suffer heavily from the adverse effects of client heterogeneity. In this paper, we boost semi-supervised FL by addressing this issue through model personalization and client-variance-reduction. In particular, we propose a novel and unified problem formulation based on pseudo-labeling and model interpolation. We then propose an effective algorithm, named FedCPSL, which judiciously combines a novel momentum-based client-variance-reduction scheme with normalized averaging. The convergence of FedCPSL is analyzed, showing that it is resilient to client heterogeneity and attains a sublinear convergence rate. Experimental results on image classification tasks are also presented to demonstrate the efficacy of FedCPSL over benchmark algorithms.
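To make the three named ingredients concrete, here is a minimal sketch of (i) model interpolation for personalization, (ii) a momentum-based client-variance-reduction step, and (iii) normalized averaging at the server. The function names, the drift `correction` term, and all hyperparameter values are illustrative assumptions for exposition, not the paper's exact FedCPSL algorithm.

```python
import numpy as np

def interpolate(global_w, personal_w, alpha):
    """Personalized model as a convex combination of local and global weights.

    alpha = 1 recovers a fully local model; alpha = 0 the shared global one.
    """
    return alpha * personal_w + (1.0 - alpha) * global_w

def local_step(w, grad, momentum, correction, beta=0.9, lr=0.1):
    """One client update: momentum smooths the stochastic gradient, and a
    drift `correction` (an estimate of the local-vs-global gradient gap)
    is subtracted to reduce client variance."""
    momentum = beta * momentum + (1.0 - beta) * grad
    w = w - lr * (momentum - correction)
    return w, momentum

def normalized_average(deltas, local_steps):
    """Server aggregation: divide each client's accumulated update by its
    number of local steps so clients that ran more steps do not dominate."""
    normed = [d / s for d, s in zip(deltas, local_steps)]
    return sum(normed) / len(normed)
```

Normalizing by local step counts is the key difference from plain FedAvg-style averaging: without it, heterogeneous local computation biases the aggregate toward fast clients.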

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

