Stable Federated Learning with Dataset Condensation

Seong Woong Kim, Dong Wan Choi

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Federated learning (FL) is a new machine learning paradigm in which multiple clients train local models that are collaboratively integrated into a single global model. Unlike centralized learning, the integrated global model cannot be tested in FL because the server does not collect any data samples; moreover, the global model is often sent back and immediately applied on clients even in the middle of training, as in Gboard. Therefore, if the performance of the global model is not stable, which is unfortunately the case in many FL scenarios with non-IID data, clients can be provided with an inaccurate model. This paper explores the main reason for this training instability of FL, namely what we call temporary imbalance, which occurs across rounds and leads to loss of knowledge from previous rounds. To solve this problem, we propose a dataset condensation method that summarizes the local data of each client without compromising privacy. The condensed data are transmitted to the server together with the local model and are used by the server to ensure stable and consistent performance of the global model. Experimental results show that the global model not only achieves training stability but also exhibits fast convergence.
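The abstract does not specify the condensation algorithm. A common choice in the dataset condensation literature is gradient matching, where a small set of learnable synthetic samples is optimized so that the gradients it induces in a model mimic those induced by the real local data; the synthetic samples, rather than raw data, are then sent to the server. The following minimal PyTorch sketch illustrates that idea for a single client; the network, input shape, and hyperparameters (SmallConvNet, 28x28 single-channel inputs, ipc, steps, lr) are illustrative assumptions, not the paper's actual implementation.

import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallConvNet(nn.Module):
    """Tiny classifier used only to provide gradients for matching (assumed architecture)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(1, 16, 3, padding=1)
        self.fc = nn.Linear(16 * 28 * 28, num_classes)

    def forward(self, x):
        return self.fc(F.relu(self.conv(x)).flatten(1))

def gradient_match_loss(net, real_x, real_y, syn_x, syn_y):
    """MSE between per-parameter gradients on a real batch vs. the synthetic set."""
    params = list(net.parameters())
    g_real = torch.autograd.grad(F.cross_entropy(net(real_x), real_y), params)
    g_syn = torch.autograd.grad(F.cross_entropy(net(syn_x), syn_y),
                                params, create_graph=True)
    return sum(F.mse_loss(gs, gr.detach()) for gs, gr in zip(g_syn, g_real))

def condense_local_data(loader, num_classes=10, ipc=1, steps=200, lr=0.1):
    """Learn ipc synthetic samples per class whose gradients mimic the real data."""
    syn_x = torch.randn(num_classes * ipc, 1, 28, 28, requires_grad=True)
    syn_y = torch.arange(num_classes).repeat_interleave(ipc)
    opt = torch.optim.SGD([syn_x], lr=lr)
    net = SmallConvNet(num_classes)  # randomly initialized; only its gradients matter
    for _, (real_x, real_y) in zip(range(steps), itertools.cycle(loader)):
        opt.zero_grad()
        gradient_match_loss(net, real_x, real_y, syn_x, syn_y).backward()
        opt.step()
    # The condensed set would be uploaded to the server along with the local model.
    return syn_x.detach(), syn_y

A client would call condense_local_data on its local DataLoader once per round (or periodically) and transmit the returned tensors; how the server then exploits them to stabilize aggregation is the paper's contribution and is not reproduced here.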

Original language: English
Pages (from-to): 52-62
Number of pages: 11
Journal: Journal of Computing Science and Engineering
Volume: 16
Issue number: 1
State: Published - Mar 2022

Bibliographical note

Publisher Copyright:
© 2022. The Korean Institute of Information Scientists and Engineers. All Rights Reserved.

Keywords

  • Class imbalance
  • Dataset compression
  • Deep learning
  • Federated learning
