ASCFL: Accurate and Speedy Semi-Supervised Clustering Federated Learning
Tsinghua Science and Technology 2023, 28 (5): 823-837
Published: 19 May 2023

The influence of non-Independent and Identically Distributed (non-IID) data on Federated Learning (FL) has been a serious concern. Clustered Federated Learning (CFL) is an emerging approach for reducing the impact of non-IID data, which groups clients according to similarity computed from relevant metrics. Unfortunately, existing CFL methods pursue accuracy improvements alone and ignore the convergence rate. Additionally, the designed client selection strategy affects the clustering results. Finally, traditional semi-supervised learning changes the distribution of data on clients, resulting in higher local costs and undesirable performance. In this paper, we propose a novel CFL method named ASCFL, which selects clients to participate in training and can dynamically adjust the balance between accuracy and convergence speed on datasets consisting of labeled and unlabeled data. To handle unlabeled data, the label prediction strategy predicts labels with encoders. The client selection strategy improves accuracy and reduces overhead by selecting clients with higher losses to participate in the current round. Furthermore, the similarity-based clustering strategy uses a new indicator to measure the similarity between clients. Experimental results show that ASCFL outperforms three state-of-the-art methods in model accuracy and convergence speed on two popular datasets.
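The loss-based client selection described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name `select_clients` and the use of a plain highest-loss ranking are assumptions, since the abstract only states that clients with higher losses are preferred for the current round.

```python
def select_clients(client_losses, num_selected):
    """Pick the clients with the highest reported local losses.

    Hypothetical sketch of loss-based client selection: clients whose
    local models fit poorly (high loss) are prioritized, so each round
    of federated training focuses on under-served data distributions.

    client_losses: dict mapping client id -> latest local loss
    num_selected:  number of clients to include in the current round
    """
    ranked = sorted(client_losses.items(), key=lambda kv: kv[1], reverse=True)
    return [cid for cid, _ in ranked[:num_selected]]

# Example: clients "c" and "a" report the highest losses, so they are chosen.
losses = {"a": 0.9, "b": 0.2, "c": 1.4, "d": 0.5}
print(select_clients(losses, 2))  # → ['c', 'a']
```

In practice, such a strategy is often softened (e.g., sampling proportionally to loss rather than taking a strict top-k) to avoid starving low-loss clients; the strict ranking above is kept only for clarity.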
