TY - GEN
T1 - FedCP: Separating Feature Information for Personalized Federated Learning via Conditional Policy
T2 - 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2023
AU - Zhang, Jianqing
AU - Hua, Yang
AU - Wang, Hao
AU - Song, Tao
AU - Xue, Zhengui
AU - Ma, Ruhui
AU - Guan, Haibing
N1 - Publisher Copyright:
© 2023 ACM.
PY - 2023/8/4
Y1 - 2023/8/4
AB - Recently, personalized federated learning (pFL) has attracted increasing attention for privacy protection, collaborative learning, and tackling statistical heterogeneity among clients such as hospitals and mobile smartphones. Most existing pFL methods focus on exploiting the global information and personalized information in the client-level model parameters while neglecting that data is the source of these two kinds of information. To address this, we propose the Federated Conditional Policy (FedCP) method, which generates a conditional policy for each sample to separate the global information and personalized information in its features and then processes them with a global head and a personalized head, respectively. FedCP considers personalization in a sample-specific manner, making it more fine-grained than existing pFL methods. Extensive experiments in computer vision and natural language processing domains show that FedCP outperforms eleven state-of-the-art methods by up to 6.69%. Furthermore, FedCP maintains its superiority when some clients accidentally drop out, which frequently happens in mobile settings. Our code is publicly available at https://github.com/TsingZ0/FedCP.
KW - conditional computing
KW - feature separation
KW - federated learning
KW - personalization
KW - statistical heterogeneity
UR - http://www.scopus.com/inward/record.url?scp=85171345096&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85171345096&partnerID=8YFLogxK
U2 - 10.1145/3580305.3599345
DO - 10.1145/3580305.3599345
M3 - Conference contribution
AN - SCOPUS:85171345096
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 3249
EP - 3261
BT - KDD 2023 - Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Y2 - 6 August 2023 through 10 August 2023
ER -
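
For context, the abstract above describes a per-sample conditional policy that splits each sample's features into a global part and a personalized part, which are then processed by a global head and a personalized head. Below is a minimal PyTorch sketch of that idea only; the class names (PolicyNetwork, FeatureSeparationModel), the soft sigmoid gate, and the additive fusion of the two heads are illustrative assumptions rather than the authors' implementation, which is available at https://github.com/TsingZ0/FedCP.

    # Minimal sketch, assuming a soft per-dimension gate as the "conditional policy".
    import torch
    import torch.nn as nn


    class PolicyNetwork(nn.Module):
        """Generates a per-sample policy that splits a feature vector into a
        'global' part and a 'personalized' part (assumed: sigmoid gating)."""

        def __init__(self, feature_dim: int):
            super().__init__()
            self.gate = nn.Sequential(nn.Linear(feature_dim, feature_dim), nn.Sigmoid())

        def forward(self, features: torch.Tensor):
            policy = self.gate(features)            # in [0, 1], conditioned on the sample
            global_part = policy * features
            personalized_part = (1.0 - policy) * features
            return global_part, personalized_part


    class FeatureSeparationModel(nn.Module):
        """Feature extractor + policy + two heads, mirroring the abstract's
        description: the global head would be aggregated across clients,
        while the personalized head stays client-local."""

        def __init__(self, feature_extractor: nn.Module, feature_dim: int, num_classes: int):
            super().__init__()
            self.feature_extractor = feature_extractor
            self.policy = PolicyNetwork(feature_dim)
            self.global_head = nn.Linear(feature_dim, num_classes)        # shared / aggregated
            self.personalized_head = nn.Linear(feature_dim, num_classes)  # kept local

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            features = self.feature_extractor(x)
            global_part, personalized_part = self.policy(features)
            # The additive fusion of the two heads' logits is an assumption of this sketch.
            return self.global_head(global_part) + self.personalized_head(personalized_part)


    # Example usage with a toy feature extractor (hypothetical shapes):
    # model = FeatureSeparationModel(nn.Linear(32, 16), feature_dim=16, num_classes=10)
    # logits = model(torch.randn(8, 32))   # -> shape (8, 10)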