TY - JOUR
T1 - ESFLM
T2 - Efficient and Secure Federated Learning Model with Homomorphic Encryption
AU - Li, Yang
AU - Xia, Chunhe
AU - Li, Chang
AU - Li, Xiaojian
AU - Wang, Tianbo
N1 - Publisher Copyright:
Copyright © 2025. Published by Elsevier Ltd.
PY - 2026/1
Y1 - 2026/1
N2 - Federated learning based on homomorphic encryption has attracted widespread attention for its strong security and enhanced protection of user data privacy. However, the nature of encrypted computation introduces three major challenges: computation efficiency, attack defense, and contribution assessment. The first concerns the efficiency of encrypted computation during model aggregation, the second involves defending against malicious attacks under encryption, and the third addresses the fairness of contribution assessment for encrypted local models. This paper presents an Efficient and Secure Federated Learning Model with Homomorphic Encryption (ESFLM) to protect model privacy and tackle the aforementioned challenges. First, we leverage multiple nodes to perform parallel aggregation of local models, thereby improving the efficiency of encrypted model aggregation. Second, we introduce trusted supervisor nodes to inspect local models when the global model is under attack, enabling an effective defense against malicious behavior under homomorphic encryption. Finally, we fairly reward local training nodes based on their verified training time, even when local models remain encrypted. Experiments on three real-world datasets demonstrate that our model significantly outperforms baseline approaches in terms of both efficiency and security.
AB - Federated learning based on homomorphic encryption has attracted widespread attention for its strong security and enhanced protection of user data privacy. However, the nature of encrypted computation introduces three major challenges: computation efficiency, attack defense, and contribution assessment. The first concerns the efficiency of encrypted computation during model aggregation, the second involves defending against malicious attacks under encryption, and the third addresses the fairness of contribution assessment for encrypted local models. This paper presents an Efficient and Secure Federated Learning Model with Homomorphic Encryption (ESFLM) to protect model privacy and tackle the aforementioned challenges. First, we leverage multiple nodes to perform parallel aggregation of local models, thereby improving the efficiency of encrypted model aggregation. Second, we introduce trusted supervisor nodes to inspect local models when the global model is under attack, enabling an effective defense against malicious behavior under homomorphic encryption. Finally, we fairly reward local training nodes based on their verified training time, even when local models remain encrypted. Experiments on three real-world datasets demonstrate that our model significantly outperforms baseline approaches in terms of both efficiency and security.
KW - Federated learning
KW - Homomorphic encryption
KW - Secret sharing
UR - https://www.scopus.com/pages/publications/105022810199
U2 - 10.1016/j.aei.2025.104118
DO - 10.1016/j.aei.2025.104118
M3 - Article
AN - SCOPUS:105022810199
SN - 1474-0346
VL - 69
JO - Advanced Engineering Informatics
JF - Advanced Engineering Informatics
M1 - 104118
ER -