A STOCHASTIC GRADIENT APPROACH FOR COMMUNICATION EFFICIENT CONFEDERATED LEARNING

Bin Wang, Jun Fang, Hongbin Li, Yonina C. Eldar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this work, we consider a multi-server federated learning (FL) framework, referred to as confederated learning (CFL), to accommodate a larger number of users. To reduce the communication overhead of the CFL system, we propose a linearly convergent stochastic gradient method whose central component is a conditionally-triggered user selection (CTUS) mechanism. Simulation results show that the proposed algorithm achieves better communication efficiency than GT-SAGA.
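The abstract's core idea, conditionally-triggered user selection, can be illustrated with a minimal sketch: a user uploads its local gradient only when that gradient has drifted sufficiently from the copy it last communicated, so the server reuses stale copies for quiet users. The quadratic local losses, the threshold rule, and all names below are illustrative assumptions for exposition, not the paper's actual CTUS algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, num_users, threshold, lr = 5, 10, 0.05, 0.1

# Synthetic quadratic local losses: f_i(x) = 0.5 * ||x - a_i||^2, so grad_i = x - a_i.
targets = [rng.normal(size=dim) for _ in range(num_users)]
x = np.zeros(dim)                                       # global model
last_sent = [np.zeros(dim) for _ in range(num_users)]   # gradients last uploaded per user

for it in range(50):
    uploads = []
    for i in range(num_users):
        g = x - targets[i]  # current local gradient
        # Trigger condition (assumed form): upload only if the gradient
        # changed enough since this user's last upload.
        if np.linalg.norm(g - last_sent[i]) > threshold:
            last_sent[i] = g
            uploads.append(i)
    # Server aggregates the freshest gradient copy it holds for every user,
    # including stale copies from users that did not trigger this round.
    avg_grad = np.mean(last_sent, axis=0)
    x -= lr * avg_grad

# Distance to the minimizer of the average loss (the mean of the targets);
# staleness keeps it within roughly a threshold-sized neighborhood.
print(np.linalg.norm(x - np.mean(targets, axis=0)))
```

Because non-triggering users stay silent, per-round uplink traffic shrinks as the model stabilizes, at the cost of a small staleness error controlled by the threshold.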

Original language: English
Title of host publication: 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings
Pages: 5170-5174
Number of pages: 5
ISBN (Electronic): 9798350344851
DOIs
State: Published - 2024
Event: 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Seoul, Korea, Republic of
Duration: 14 Apr 2024 – 19 Apr 2024

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149

Conference

Conference: 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
Country/Territory: Korea, Republic of
City: Seoul
Period: 14/04/24 – 19/04/24

Keywords

  • communication efficiency
  • Confederated learning
  • user selection
