Online Federated Reproduced Gradient Descent With Time-Varying Global Optima

Research output: Contribution to journal › Article › peer-review

Abstract

This paper addresses an online federated learning problem in which time drift in the data distribution leads to time-varying global optima. To adapt to the drift, this paper designs a random Fourier features (RFF) model, grounded in Reproducing Kernel Hilbert Space (RKHS) theory, to track the global gradient. The model also mitigates the gradient variance arising from local data and the gradient bias due to data heterogeneity. Based on this model, the paper further proposes an online federated reproduced gradient descent (OFedRGD) algorithm. The Wasserstein distance is then employed as a distribution metric to analyze the regret of OFedRGD, which is composed of the cumulative distribution drift and the cumulative gradient error caused by stochasticity and heterogeneity. Additionally, a set of CLEAR datasets, covering two online learning tasks, is used to test the proposed algorithm. The results show that the proposed algorithm improves classification accuracy on the two tasks by 5% and 16%, respectively, and that its performance is less adversely affected by the degree of data dispersion.
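The random Fourier features construction underlying the model can be illustrated with a minimal sketch. This is the generic RFF approximation of the Gaussian (RBF) kernel, not the paper's specific model; the bandwidth `gamma`, the feature count, and all variable names are illustrative assumptions:

```python
import numpy as np

def rff_features(X, n_features=2000, gamma=0.5, seed=0):
    """Map inputs to random Fourier features so that the inner product of the
    feature maps approximates the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    (Generic RFF sketch; parameters are illustrative, not from the paper.)"""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Fourier transform of the Gaussian kernel
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    # Random phase offsets
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the RFF approximation against the exact kernel matrix
X = np.random.default_rng(1).normal(size=(5, 3))
Z = rff_features(X, n_features=2000, gamma=0.5)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
```

Because the feature map is an explicit finite-dimensional vector, a linear (gradient-descent) model trained on `Z` approximates kernel learning in the RKHS, which is what makes RFF attractive for tracking a gradient online.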

Original language: English
Pages (from-to): 1379-1393
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 73
DOIs
State: Published - 2025

Keywords

  • Online federated learning
  • data heterogeneity
  • random Fourier feature
  • reproduced gradient
  • reproducing kernel Hilbert space
  • time drift
  • gradient variance

