Communication-Efficient Multi-Server Federated Learning via Over-the-Air Computation

Abstract
Thanks to the Internet of Things (IoT), there has been explosive growth in edge devices, which generate a tremendous amount of data with invaluable potential. However, conventional data mining and machine learning (ML) paradigms require transmitting raw data to data centers for further use, which places a heavy burden on communication networks and exposes the data to high privacy risks. Federated learning enables the training of ML models on distributed datasets, protecting data privacy and alleviating transmission burdens. Meanwhile, over-the-air (OTA) computation can exploit the superposition property of wireless communication channels to aggregate transmissions. Motivated by this, in this paper we propose a co-phase OTA approach for communication-efficient uploading in multi-server federated learning, which does not require expanding the uplink channel bandwidth as the numbers of users and models increase. In addition, a digital OTA scheme with randomized transmission is proposed to overcome the disadvantages of analog OTA, and performance analyses are derived for both the analog and digital OTA schemes. Simulation results show that digital OTA achieves a lower cost function and requires fewer iterations to converge than analog OTA, since more users can upload.
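The core idea behind OTA aggregation can be illustrated with a toy sketch. The snippet below is an illustrative simulation only, not the authors' algorithm: it assumes an idealized noiseless uplink where each user applies co-phase precoding (the conjugate phase of its channel) so that all transmissions add constructively in a single channel use, and the server recovers a channel-gain-weighted average of the local updates. All variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
num_users, dim = 8, 4

# Each user holds a local model update (e.g., a gradient vector).
local_updates = rng.normal(size=(num_users, dim))

# Complex uplink channel gain between each user and the server.
channels = rng.normal(size=num_users) + 1j * rng.normal(size=num_users)

# Co-phase precoding: each user rotates its signal by the conjugate
# phase of its channel, so h_k * p_k = |h_k| and all signals add
# constructively at the server.
precoders = np.conj(channels) / np.abs(channels)

# The wireless channel superimposes all transmissions in one time slot
# (idealized noiseless model).
received = np.sum((channels * precoders)[:, None] * local_updates, axis=0)

# The server normalizes by the total effective gain, recovering a
# channel-gain-weighted average of the local updates.
effective_gain = np.sum(np.abs(channels))
estimate = np.real(received) / effective_gain

# Reference: the same weighted average computed directly.
target = np.average(local_updates, axis=0, weights=np.abs(channels))
```

Because the channel itself performs the summation, the uplink cost of one aggregation round stays constant in the number of users, which is the property the co-phase approach builds on.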
| Original language | English |
|---|---|
| Pages (from-to) | 10683-10695 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Mobile Computing |
| Volume | 24 |
| Issue number | 10 |
| DOIs | |
| State | Published - 2025 |
Keywords
- Federated learning
- Internet-of-Things
- distributed stochastic gradient descent
- over-the-air (OTA) computation