Abstract
Federated learning offers a framework for collaborative machine learning without compromising data privacy, an especially critical feature when dealing with sensitive graph-structured data in fields like social networks and healthcare. Despite recent advancements, traditional Federated Graph Neural Networks (FedGNNs) struggle to handle dynamic, heterogeneous graph structures and fail to capture long-range dependencies effectively. To address these limitations, we introduce a novel federated framework incorporating the Graph Transformer architecture: the Federated Graph Propagation Transformer (FedGPTrans). FedGPTrans leverages a secure mixture graph smoothing mechanism and transformer-based attention to efficiently manage non-local interactions and varying graph topologies, without exposing sensitive data. Experimental results on six benchmark datasets demonstrate that FedGPTrans outperforms state-of-the-art FedGNNs, achieving competitive accuracy and superior privacy preservation. Our method not only advances federated graph learning but also bridges the performance gap with centralized models while ensuring rich global graph information fusion, making it a versatile solution for privacy-sensitive applications in decentralized environments.
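The abstract does not spell out FedGPTrans's architecture, but the two ingredients it names, a graph smoothing mechanism for local structure and transformer-style attention for non-local interactions, combined under federated weight averaging, can be illustrated with a minimal sketch. Everything below (the `mixture_forward` layer, the mixing weight `alpha`, and the plain FedAvg server step) is an illustrative assumption, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def smooth(X, A):
    """One step of neighbourhood mean smoothing: local graph propagation."""
    deg = A.sum(axis=1, keepdims=True)
    return (A @ X) / np.maximum(deg, 1.0)

def self_attention(X, Wq, Wk, Wv):
    """Dense self-attention over all nodes: captures long-range dependencies
    that neighbourhood aggregation alone misses."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    return softmax(Q @ K.T / np.sqrt(K.shape[1])) @ V

def mixture_forward(X, A, W, alpha=0.5):
    """Hypothetical 'mixture' layer: blend smoothed local features with
    global attention features (alpha is an assumed mixing weight)."""
    Wq, Wk, Wv = W
    return alpha * smooth(X, A) + (1 - alpha) * self_attention(X, Wq, Wk, Wv)

def fedavg(client_weights):
    """Server step: average model weights across clients. Only weights are
    shared; raw graphs and node features never leave a client."""
    return [np.mean([w[i] for w in client_weights], axis=0)
            for i in range(len(client_weights[0]))]

# Three clients, each with a private graph and a local copy of the weights.
d, clients = 4, []
for _ in range(3):
    X = rng.normal(size=(5, d))                      # private node features
    A = (rng.random((5, 5)) < 0.4).astype(float)     # private adjacency
    W = [rng.normal(size=(d, d)) for _ in range(3)]  # Wq, Wk, Wv
    H = mixture_forward(X, A, W)                     # local forward pass
    clients.append(W)                                # (training step omitted)

global_W = fedavg(clients)  # server aggregates weights only
```

The privacy-relevant point the sketch makes is that `fedavg` sees only parameter tensors; the secure aggregation and the actual training loop of FedGPTrans are omitted here.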
| Original language | English |
|---|---|
| Article number | 102954 |
| Journal | Information Fusion |
| Volume | 118 |
| DOIs | |
| State | Published - Jun 2025 |
Keywords
- Data security
- Federated learning
- Graph learning
- Privacy preserving
Article title: 'Federated graph transformer with mixture attentions for secure graph knowledge fusions'