FedPDM: Representation enhanced federated learning with privacy preserving diffusion models

Abstract
Most existing semi-parameter-sharing federated learning (FL) frameworks use generative models to share only part of the model parameters with the server, which effectively enhances the data privacy of each client. However, these generative models often suffer from degraded model utility due to poor representation robustness. Meanwhile, representation inconsistency between local and global models exacerbates the client drift problem under non-IID scenarios. Furthermore, existing semi-parameter-sharing FL frameworks overlook the representation leakage risks associated with generator sharing and fail to balance privacy and utility. To address these challenges, we propose FedPDM, a semi-parameter-sharing FL framework built upon a privacy-preserving diffusion model (PDM). Specifically, PDM aligns the model with features from the privacy extractor without directly exposing the extractor, mitigating the utility degradation caused by poor representation robustness. Moreover, a feature-level penalty term is introduced into the optimization objective of PDM to avoid representation leakage. We further design a two-stage aggregation strategy that addresses representation inconsistency through initialization correction with a Gaussian constraint for knowledge distillation. Finally, we provide the first theoretical convergence analysis for semi-parameter-sharing FL, demonstrating that our framework converges at a rate of O(1/T). Extensive experiments on four datasets show that FedPDM improves average accuracy by 1.78% to 5.56% over various state-of-the-art baselines.
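To make the feature-level penalty idea concrete, here is a minimal, hypothetical sketch of the general mechanism the abstract describes: a base training objective augmented with a weighted penalty on the similarity between generator features and the private extractor's features. This is not the paper's released code; the function names, the choice of cosine similarity as the penalty, and the weight `lam` are all assumptions made for illustration.

```python
import math

def base_loss(pred, target):
    """Mean-squared reconstruction error, standing in for a diffusion denoising loss."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def leakage_penalty(gen_feat, private_feat):
    """Cosine similarity between generator features and the private extractor's
    features; minimizing it pushes the two representations apart, discouraging
    leakage of private features through the shared generator."""
    dot = sum(g * f for g, f in zip(gen_feat, private_feat))
    norm = math.sqrt(sum(g * g for g in gen_feat)) * math.sqrt(sum(f * f for f in private_feat))
    return dot / (norm + 1e-12)  # small epsilon guards against zero vectors

def total_loss(pred, target, gen_feat, private_feat, lam=0.1):
    """Base objective plus the weighted feature-level penalty (lam is assumed)."""
    return base_loss(pred, target) + lam * leakage_penalty(gen_feat, private_feat)
```

Under this sketch, two clients whose generator features align closely with their private features pay a higher total loss than clients whose features are decorrelated, so gradient descent on `total_loss` trades off reconstruction quality against representation leakage.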
| Original language | English |
|---|---|
| Article number | 115452 |
| Journal | Knowledge-Based Systems |
| Volume | 338 |
| State | Published - 8 Apr 2026 |
Keywords
- Diffusion model
- Federated learning
- Privacy protection
- Split learning