FedCO: Communication-Efficient Federated Learning via Clustering Optimization †
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0001-6061-0861
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0003-3128-191X
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science. ORCID iD: 0000-0002-3118-5058
2022 (English). In: Future Internet, E-ISSN 1999-5903, Vol. 14, no. 12, article id 377. Article in journal (Refereed). Published.
Abstract [en]

Federated Learning (FL) provides a promising way to preserve privacy by learning shared models on distributed devices without uploading local data to a central server. However, most existing work shows that FL incurs high communication costs. To address this challenge, we propose a clustering-based federated solution, entitled Federated Learning via Clustering Optimization (FedCO), which optimizes model aggregation and reduces communication costs. To reduce communication, we first divide the participating workers into groups based on the similarity of their model parameters and then select only one representative, the best-performing worker, from each group to communicate with the central server. In each successive round, we apply the Silhouette validation technique to check whether each representative still fits tightly within its current cluster. If not, the representative is either moved to a more appropriate cluster or forms a singleton cluster. Finally, we use split optimization to update and improve the whole clustering solution; the updated clustering is then used to select new cluster representatives. In this way, the proposed FedCO approach updates the clusters by repeatedly evaluating them and splitting them when doing so improves the workers' partitioning. The potential of the proposed method is demonstrated on publicly available datasets and LEAF datasets under both IID and non-IID data distribution settings. The experimental results indicate that FedCO outperforms the state-of-the-art FL approaches FedAvg, FedProx, and CMFL in reducing communication costs and achieves better accuracy in both the IID and non-IID cases. © 2022 by the authors.
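The cluster-and-representative selection described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the use of k-means, Euclidean distance on flattened parameter vectors, scikit-learn's `silhouette_samples`, and the function/variable names are all assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_samples


def select_representatives(param_vectors, accuracies, n_clusters):
    """Group workers by model-parameter similarity (k-means here, as an
    illustrative choice) and pick the best-performing worker of each
    cluster as its representative to communicate with the server."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = km.fit_predict(param_vectors)
    reps = {}
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        # Representative = cluster member with the highest local accuracy.
        reps[c] = int(members[np.argmax(accuracies[members])])
    return labels, km.cluster_centers_, reps


def silhouette_reassign(param_vectors, labels, centroids, threshold=0.0):
    """Silhouette check: a worker whose silhouette score falls below the
    threshold no longer fits its cluster and is moved to the nearest
    other centroid (the abstract's 'more appropriate cluster')."""
    scores = silhouette_samples(param_vectors, labels)
    new_labels = labels.copy()
    for i, s in enumerate(scores):
        if s < threshold:
            d = np.linalg.norm(centroids - param_vectors[i], axis=1)
            d[labels[i]] = np.inf  # exclude the worker's current cluster
            new_labels[i] = int(np.argmin(d))
    return new_labels
```

With well-separated worker groups, the silhouette scores stay high and no reassignment occurs; a worker drifting between groups would be moved in a later round. The split-optimization step of FedCO is omitted here for brevity.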

Place, publisher, year, edition, pages
MDPI, 2022. Vol. 14, no 12, article id 377
Keywords [en]
clustering, communication efficiency, convolutional neural network, federated learning, Internet of Things, Convolutional neural networks, Cost reduction, Learning systems, Privacy-preserving techniques, Central servers, Clustering optimizations, Clusterings, Communication cost, Optimization approach, Shared model, Workers'
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:bth-24176
DOI: 10.3390/fi14120377
ISI: 000901037100001
Scopus ID: 2-s2.0-85144590253
OAI: oai:DiVA.org:bth-24176
DiVA, id: diva2:1725283
Note

open access

Available from: 2023-01-10. Created: 2023-01-10. Last updated: 2023-08-03. Bibliographically approved.

Open Access in DiVA

fulltext (2836 kB), 257 downloads
File information
File name: FULLTEXT01.pdf
File size: 2836 kB
Checksum (SHA-512):
d7b4c2ded5a960f3a477aac61aa621889e1af5086fbf6f4af70bb0949069d7a41ed3a63e6b57d5c61f3e1323d3471e7c817245c2124bfe6ae38377dd2fe4156a
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Authority records

Al-Saedi, Ahmed Abbas Mohsin; Boeva, Veselka; Casalicchio, Emiliano

Search in DiVA

By author/editor
Al-Saedi, Ahmed Abbas Mohsin; Boeva, Veselka; Casalicchio, Emiliano
By organisation
Department of Computer Science
In the same journal
Future Internet
Computer Sciences

Search outside of DiVA

Google
Google Scholar
Total: 257 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.

Total: 401 hits