Efficient parameter aggregation in federated learning with hybrid convergecast

Yangyang Tao, Junxiu Zhou, Shucheng Yu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In federated learning, workers train local models on their private data sets and upload only local gradients to the remote aggregator; data privacy is thus preserved while parallelism is achieved. In large-scale deep learning tasks, however, the frequent worker-aggregator interactions needed to transmit parameters can severely degrade system performance in terms of communication cost, the required number of iterations, per-iteration latency, and the accuracy of the trained model, because of system 'churns' (i.e., devices frequently joining and leaving the network). Existing research leverages different network topologies to improve the performance of federated learning. In this paper, we propose a novel hybrid network topology design that integrates ring (R) and n-ary tree (T) structures to provide flexible and adaptive convergecast in federated learning. Specifically, multiple participating peers within one hop form a local ring to adapt to device dynamics (i.e., 'churns') and carry out local cooperative shuffling; an n-ary convergecast tree is then formed from the local rings to the aggregator to ensure communication efficiency. Theoretical analysis shows the superiority of the proposed hybrid (R+T) convergecast design in terms of system latency as compared to existing topologies. Prototype-based simulation on CloudLab shows that the hybrid (R+T) design reduces the number of iteration rounds while achieving the best model accuracy under system 'churns' as compared to the state of the art.
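The abstract's two-stage aggregation (local rings feeding an n-ary tree rooted at the aggregator) can be illustrated with a minimal sketch. This is not the paper's implementation: all function names, the ring/tree sizes, and the plain gradient-summing logic are illustrative assumptions, and the ring's churn-tolerant cooperative shuffling is not modeled.

```python
# Hypothetical sketch of hybrid (R+T) convergecast: workers are grouped into
# one-hop rings, each ring aggregates its members' gradients locally, and the
# ring-level partial sums are combined up an n-ary tree to the aggregator.
# All names and parameters here are illustrative, not from the paper.

def ring_aggregate(gradients):
    """Sum the gradient vectors held by the workers of one local ring."""
    total = [0.0] * len(gradients[0])
    for g in gradients:
        total = [t + x for t, x in zip(total, g)]
    return total

def tree_aggregate(partials, n=2):
    """Combine ring-level partial sums up an n-ary convergecast tree.

    Each internal node sums the partials of up to n children; levels are
    collapsed repeatedly until a single global sum reaches the root.
    """
    while len(partials) > 1:
        partials = [
            ring_aggregate(partials[i:i + n])
            for i in range(0, len(partials), n)
        ]
    return partials[0]

# Example: 6 workers in 3 rings of 2, each holding a 2-dimensional gradient.
rings = [
    [[1.0, 2.0], [3.0, 4.0]],
    [[5.0, 6.0], [7.0, 8.0]],
    [[9.0, 10.0], [11.0, 12.0]],
]
partials = [ring_aggregate(r) for r in rings]  # one partial sum per ring
global_sum = tree_aggregate(partials, n=2)     # convergecast to the root
avg = [v / 6 for v in global_sum]              # federated averaging step
```

In this toy run `global_sum` is `[36.0, 42.0]` and `avg` is `[6.0, 7.0]`; the point is only the structure: ring churn is absorbed locally (a ring's partial sum is unaffected by which member forwards it), while the tree keeps the number of hops to the aggregator logarithmic in the number of rings.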

Original language: English
Title of host publication: 2021 IEEE 18th Annual Consumer Communications and Networking Conference, CCNC 2021
ISBN (Electronic): 9781728197944
DOIs
State: Published - 9 Jan 2021
Event: 18th IEEE Annual Consumer Communications and Networking Conference, CCNC 2021 - Virtual, Las Vegas, United States
Duration: 9 Jan 2021 - 13 Jan 2021

Publication series

Name: 2021 IEEE 18th Annual Consumer Communications and Networking Conference, CCNC 2021

Conference

Conference: 18th IEEE Annual Consumer Communications and Networking Conference, CCNC 2021
Country/Territory: United States
City: Virtual, Las Vegas
Period: 9/01/21 - 13/01/21

Keywords

  • CloudLab
  • Convergecast
  • Federated Learning
  • N-ary Tree
  • Parallel Machine Learning
  • Ring

