TY - JOUR
T1 - Siren+: Robust Federated Learning with Proactive Alarming and Differential Privacy
T2 - IEEE Transactions on Dependable and Secure Computing
AU - Guo, Hanxi
AU - Wang, Hao
AU - Song, Tao
AU - Hua, Yang
AU - Ma, Ruhui
AU - Jin, Xiulang
AU - Xue, Zhengui
AU - Guan, Haibing
N1 - Publisher Copyright:
© 2024 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
PY - 2024
Y1 - 2024
N2 - Federated learning (FL), an emerging machine learning paradigm that trains a global model across distributed clients without violating data privacy, has recently attracted significant attention. However, FL's distributed nature and iterative training greatly expand the attack surface for Byzantine and inference attacks. Existing FL defense methods can hardly protect FL from both Byzantine and inference attacks due to their fundamental conflicts: the noise injected to defend against inference attacks perturbs model weights and training data, obscuring the model analysis that Byzantine-robust methods rely on to detect attacks. Moreover, the practicality of existing Byzantine-robust methods is limited because they depend heavily on model analysis. In this article, we present Siren+, a new robust FL system that defends against a wide spectrum of Byzantine attacks and inference attacks by jointly utilizing a proactive alarming mechanism and local differential privacy (LDP). The proactive alarming mechanism orchestrates clients and the FL server to collaboratively detect attacks using distributed alarms, which are free from the noise interference introduced by LDP. Compared with state-of-the-art defense methods, Siren+ can protect FL from Byzantine and inference attacks launched by a higher proportion of malicious clients in the system while keeping the global model performing normally. Extensive experiments with diverse settings and attacks on real-world datasets show that Siren+ outperforms existing defense methods under both Byzantine and inference attacks.
AB - Federated learning (FL), an emerging machine learning paradigm that trains a global model across distributed clients without violating data privacy, has recently attracted significant attention. However, FL's distributed nature and iterative training greatly expand the attack surface for Byzantine and inference attacks. Existing FL defense methods can hardly protect FL from both Byzantine and inference attacks due to their fundamental conflicts: the noise injected to defend against inference attacks perturbs model weights and training data, obscuring the model analysis that Byzantine-robust methods rely on to detect attacks. Moreover, the practicality of existing Byzantine-robust methods is limited because they depend heavily on model analysis. In this article, we present Siren+, a new robust FL system that defends against a wide spectrum of Byzantine attacks and inference attacks by jointly utilizing a proactive alarming mechanism and local differential privacy (LDP). The proactive alarming mechanism orchestrates clients and the FL server to collaboratively detect attacks using distributed alarms, which are free from the noise interference introduced by LDP. Compared with state-of-the-art defense methods, Siren+ can protect FL from Byzantine and inference attacks launched by a higher proportion of malicious clients in the system while keeping the global model performing normally. Extensive experiments with diverse settings and attacks on real-world datasets show that Siren+ outperforms existing defense methods under both Byzantine and inference attacks.
KW - attack-agnostic defense system
KW - Byzantine-robust
KW - differential privacy
KW - federated learning
UR - http://www.scopus.com/inward/record.url?scp=85184824443&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85184824443&partnerID=8YFLogxK
U2 - 10.1109/TDSC.2024.3362534
DO - 10.1109/TDSC.2024.3362534
M3 - Article
AN - SCOPUS:85184824443
SN - 1545-5971
VL - 21
SP - 4843
EP - 4860
JO - IEEE Transactions on Dependable and Secure Computing
JF - IEEE Transactions on Dependable and Secure Computing
IS - 5
ER -