Abstract
Federated learning based on homomorphic encryption has attracted widespread attention for its strong security and enhanced protection of user data privacy. However, the nature of encrypted computation introduces three major challenges: computational efficiency, attack defense, and contribution assessment. The first concerns the efficiency of encrypted computation during model aggregation, the second involves defending against malicious attacks under encryption, and the third addresses the fairness of contribution assessment for encrypted local models. This paper presents an Efficient and Secure Federated Learning Model with Homomorphic Encryption (ESFLM) to protect model privacy and tackle these challenges. First, we leverage multiple nodes to aggregate local models in parallel, improving the efficiency of encrypted model aggregation. Second, we introduce trusted supervisor nodes that inspect local models when the global model is under attack, enabling effective defense against malicious behavior under homomorphic encryption. Finally, we fairly reward local training nodes based on their verified training time, even while local models remain encrypted. Experiments on three real-world datasets demonstrate that our model significantly outperforms baseline approaches in both efficiency and security.
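The core primitive the abstract relies on, aggregating encrypted local models without decrypting any individual one, can be illustrated with a toy additively homomorphic (Paillier-style) scheme. This is a minimal sketch under stated assumptions, not the paper's actual protocol: the primes, the quantized weight values, and the fixed randomness are all illustrative.

```python
# Toy Paillier-style additively homomorphic aggregation (illustrative only;
# the tiny primes and fixed randomness below are insecure demo choices).
from math import gcd

p, q = 293, 433                  # demo primes (far too small for real use)
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)
g = n + 1
# mu = L(g^lam mod n^2)^{-1} mod n, with L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m, r=17):
    # c = g^m * r^n mod n^2 (r fixed here for determinism; use fresh
    # randomness per ciphertext in practice)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Each client encrypts a quantized model weight; the aggregator multiplies
# the ciphertexts, which adds the plaintexts without seeing any of them.
client_weights = [12, 7, 5]
agg = 1
for w in client_weights:
    agg = (agg * encrypt(w)) % n2

print(decrypt(agg))  # → 24 (= 12 + 7 + 5)
```

The multiplicative homomorphism (ciphertext product equals plaintext sum) is what lets an untrusted aggregator combine local models while they stay encrypted; ESFLM's contribution of parallelizing this step across multiple nodes works because the ciphertext products can be computed in any grouping.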
| Original language | English |
|---|---|
| Article number | 104118 |
| Journal | Advanced Engineering Informatics |
| Volume | 69 |
| DOIs | |
| State | Published - Jan 2026 |
Keywords
- Federated learning
- Homomorphic encryption
- Secret sharing
ESFLM: Efficient and Secure Federated Learning Model with Homomorphic Encryption