TY - GEN
T1 - FedSlowdown
T2 - 1st International Workshop on Bridging Regulatory Science and Medical Imaging Evaluation, BRIDGE 2025 and 6th MICCAI Workshop on Distributed, Collaborative and Federated Learning, DeCaF 2025, Held in Conjunction with 28th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2025
AU - Akinsanya, Ayomide
AU - Brennan, Tegan
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2026.
PY - 2026
Y1 - 2026
N2 - The increasing computational and energy demands of deep neural networks (DNNs) have sparked significant interest in input-adaptive multi-exit architectures, known as Adaptive Neural Networks (AdNNs). AdNNs can significantly reduce inference time and energy usage by dynamically scaling the depth of computation, helping enable real-time medical image analysis and computer-aided diagnosis on resource-constrained devices and in low-latency medical applications. This is critical for portable ultrasound scanners and mobile diagnostic tools that operate in remote settings such as rural clinics, disaster-relief zones, or developing countries, where access to healthcare is severely limited due to financial, infrastructural, and personnel constraints. Medical imaging data is often highly sensitive, necessitating strict privacy protections to comply with healthcare regulations (e.g., HIPAA, GDPR). Federated learning (FL) offers a privacy-preserving solution for training medical deep learning models across decentralized clinical sites without sharing raw patient data. While this makes training AdNNs in an FL environment a natural fit for medical data, we show that AdNNs trained under federated learning are vulnerable to efficiency attacks. Specifically, we introduce FedSlowdown, an attack that allows one or more participating clients to maliciously degrade the computational efficiency of the global AdNN model. This increases inference time and device energy usage, slowing real-time medical image analysis. We evaluate FedSlowdown across four AdNN architectures and two medical imaging datasets (HAM10000, Fed-ISIC). Our results show that FedSlowdown can reduce AdNN efficiency by up to 90–100%, with 1.5–5× longer inference times on devices deployed in constrained environments.
AB - The increasing computational and energy demands of deep neural networks (DNNs) have sparked significant interest in input-adaptive multi-exit architectures, known as Adaptive Neural Networks (AdNNs). AdNNs can significantly reduce inference time and energy usage by dynamically scaling the depth of computation, helping enable real-time medical image analysis and computer-aided diagnosis on resource-constrained devices and in low-latency medical applications. This is critical for portable ultrasound scanners and mobile diagnostic tools that operate in remote settings such as rural clinics, disaster-relief zones, or developing countries, where access to healthcare is severely limited due to financial, infrastructural, and personnel constraints. Medical imaging data is often highly sensitive, necessitating strict privacy protections to comply with healthcare regulations (e.g., HIPAA, GDPR). Federated learning (FL) offers a privacy-preserving solution for training medical deep learning models across decentralized clinical sites without sharing raw patient data. While this makes training AdNNs in an FL environment a natural fit for medical data, we show that AdNNs trained under federated learning are vulnerable to efficiency attacks. Specifically, we introduce FedSlowdown, an attack that allows one or more participating clients to maliciously degrade the computational efficiency of the global AdNN model. This increases inference time and device energy usage, slowing real-time medical image analysis. We evaluate FedSlowdown across four AdNN architectures and two medical imaging datasets (HAM10000, Fed-ISIC). Our results show that FedSlowdown can reduce AdNN efficiency by up to 90–100%, with 1.5–5× longer inference times on devices deployed in constrained environments.
KW - Adaptive Neural Networks
KW - Efficiency Attacks
KW - Federated Learning
UR - https://www.scopus.com/pages/publications/105018304751
UR - https://www.scopus.com/pages/publications/105018304751#tab=citedBy
U2 - 10.1007/978-3-032-05663-4_15
DO - 10.1007/978-3-032-05663-4_15
M3 - Conference contribution
AN - SCOPUS:105018304751
SN - 9783032056658
T3 - Lecture Notes in Computer Science
SP - 153
EP - 163
BT - Bridging Regulatory Science and Medical Imaging Evaluation; and Distributed, Collaborative, and Federated Learning - 1st International Workshop, BRIDGE 2025, and 6th International Workshop, DeCaF 2025, Held in Conjunction with MICCAI 2025, Proceedings
A2 - Zamzmi, Ghada
A2 - Reinke, Annika
A2 - Samala, Ravi
A2 - Jiang, Meirui
A2 - Li, Xiaoxiao
A2 - Roth, Holger
A2 - Sidulova, Mariia
A2 - Kooi, Thijs
A2 - Albarqouni, Shadi
A2 - Bakas, Spyridon
A2 - Rieke, Nicola
Y2 - 23 September 2025 through 27 September 2025
ER -