FedSlowdown: Efficiency Attacks Against Federated Learning of Adaptive Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The increasing computational and energy demands of deep neural networks (DNNs) have sparked significant interest in input-adaptive multi-exit architectures, known as Adaptive Neural Networks (AdNNs). AdNNs can significantly reduce inference time and energy usage by dynamically scaling the depth of computation, enabling real-time medical image analysis and computer-aided diagnosis on resource-constrained devices and in low-latency medical applications. This is critical for portable ultrasound scanners and mobile diagnostic tools that operate in remote settings such as rural clinics, disaster-relief zones, or developing countries, where access to healthcare is severely limited by financial, infrastructural, and personnel constraints. Medical imaging data is often highly sensitive, necessitating strict privacy protections to comply with healthcare regulations (e.g., HIPAA, GDPR). Federated learning (FL) offers a privacy-preserving solution for training medical deep learning models across decentralized clinical sites without sharing raw patient data. While this makes training AdNNs in an FL environment a natural fit for medical data, we show that AdNNs trained under federated learning are vulnerable to efficiency attacks. Specifically, we introduce FedSlowdown, an attack that allows one or more participating clients to maliciously degrade the computational efficiency of the global AdNN model. This increases inference time and device energy usage, slowing real-time medical image analysis. We evaluate FedSlowdown across four AdNN architectures and two medical imaging datasets (HAM10000, Fed-ISIC). Our results show that FedSlowdown can reduce AdNN efficiency by up to 90–100%, with 1.5–5× longer inference times on devices deployed in constrained environments.
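The abstract's notion of "dynamically scaling the depth of computation" can be sketched in plain Python: after each block, an exit head scores the features, and inference stops early once the head's confidence clears a threshold. This is a toy illustration with hypothetical block/head functions, not the paper's architecture or attack implementation:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def adnn_infer(x, blocks, exit_heads, threshold=0.9):
    """Multi-exit (AdNN-style) inference: after each block, an exit head
    produces class logits; if the max softmax confidence reaches
    `threshold`, inference stops early and the remaining blocks are
    skipped. Returns (logits, blocks_executed)."""
    h = x
    for depth, (block, head) in enumerate(zip(blocks, exit_heads), start=1):
        h = block(h)
        logits = head(h)
        if max(softmax(logits)) >= threshold or depth == len(blocks):
            return logits, depth

# Toy two-block "network" (hypothetical, for illustration only).
blocks = [lambda v: [u * 2 for u in v], lambda v: [u + 1 for u in v]]
heads = [lambda v: list(v), lambda v: list(v)]

_, confident_depth = adnn_infer([3.0, 0.1], blocks, heads)   # exits early
_, uncertain_depth = adnn_infer([0.1, 0.1], blocks, heads)   # runs full depth
```

An efficiency attack in this setting, as the abstract describes it, would poison the shared model so that early-exit confidence stays below the threshold, forcing every input to pay for full-depth computation.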

Original language: English
Title of host publication: Bridging Regulatory Science and Medical Imaging Evaluation; and Distributed, Collaborative, and Federated Learning - 1st International Workshop, BRIDGE 2025, and 6th International Workshop, DeCaF 2025, Held in Conjunction with MICCAI 2025, Proceedings
Editors: Ghada Zamzmi, Annika Reinke, Ravi Samala, Meirui Jiang, Xiaoxiao Li, Holger Roth, Mariia Sidulova, Thijs Kooi, Shadi Albarqouni, Spyridon Bakas, Nicola Rieke
Pages: 153-163
Number of pages: 11
State: Published - 2026
Event: 1st International Workshop on Bridging Regulatory Science and Medical Imaging Evaluation, BRIDGE 2025, and 6th MICCAI Workshop on Distributed, Collaborative and Federated Learning, DeCaF 2025, Held in Conjunction with the 28th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2025 - Daejeon, Korea, Republic of
Duration: 23 Sep 2025 - 27 Sep 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 16135 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 1st International Workshop on Bridging Regulatory Science and Medical Imaging Evaluation, BRIDGE 2025, and 6th MICCAI Workshop on Distributed, Collaborative and Federated Learning, DeCaF 2025, Held in Conjunction with the 28th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2025
Country/Territory: Korea, Republic of
City: Daejeon
Period: 23/09/25 - 27/09/25

Keywords

  • Adaptive Neural Networks
  • Efficiency Attacks
  • Federated Learning
