Fitness-Driven Evolutionary Federated Learning: Adaptive Client Selection and Dynamic Population for Communication Efficiency

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Federated Learning (FL) enables decentralized nodes to collaboratively train models while maintaining data privacy. However, traditional FL methods often face significant challenges, including high communication overhead and slow convergence, especially when data across nodes is heterogeneous. To address these issues, this paper introduces a novel framework named Fitness-Driven Evolutionary Federated Learning (FD-EFL), which effectively combines evolutionary strategies (ES) with adaptive client selection and dynamic population adjustment. The primary innovation of FD-EFL is its fitness-driven information-sharing approach, wherein clients communicate only concise fitness metrics representing similarities between their local models and a noise-perturbed global population. This significantly reduces communication overhead. FD-EFL further enhances performance by adaptively selecting high-quality clients, effectively minimizing the impact of noisy or low-quality updates. Additionally, the framework integrates a dynamic population adjustment mechanism guided by the Critical Learning Period (CLP), dynamically expanding the population size during critical training phases to improve model accuracy, and shrinking it during non-critical phases to save communication resources. Experimental evaluations demonstrate that FD-EFL substantially lowers communication costs without compromising model accuracy, achieving comparable performance to established methods like FedAvg. Our proposed framework offers a practical and efficient solution for federated learning in heterogeneous data environments. The implementation is publicly available at: https://github.com/buaaYYC/Fed-FEL.
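To make the communication pattern described in the abstract concrete, the following is a minimal, hypothetical sketch of one evolutionary round in which clients upload only scalar fitness values for a noise-perturbed population rather than full model updates. This is an illustration of the general idea (an ES-style update with fitness sharing), not the paper's exact FD-EFL algorithm; all function and variable names here are the author's own assumptions, and the adaptive client selection and CLP-driven population sizing are omitted for brevity.

```python
import numpy as np

def es_federated_round(global_w, client_datasets, loss_fn,
                       pop_size=8, sigma=0.1, lr=0.05, seed=0):
    """One illustrative fitness-sharing round (not the paper's exact update).

    The server perturbs the global model with Gaussian noise to form a
    candidate population; each client uploads only a pop_size-length vector
    of fitness scalars instead of a full model of size global_w.size.
    """
    rng = np.random.default_rng(seed)
    # Server: sample a population of noise-perturbed candidate models.
    noise = rng.standard_normal((pop_size, global_w.size))
    population = global_w + sigma * noise

    # Clients: evaluate each candidate on local data, return scalar fitnesses
    # (higher is better, so we use negative loss).
    fitness_per_client = [
        np.array([-loss_fn(w, data) for w in population])
        for data in client_datasets
    ]

    # Server: average client fitnesses, normalize, take an ES-style step.
    f = np.mean(fitness_per_client, axis=0)
    f = (f - f.mean()) / (f.std() + 1e-8)
    new_w = global_w + lr / (pop_size * sigma) * noise.T @ f
    return new_w, pop_size  # each client uploaded only pop_size scalars

# Toy usage: quadratic "losses" with different per-client optima,
# mimicking heterogeneous local data.
def loss_fn(w, target):
    return float(np.sum((w - target) ** 2))

w = np.zeros(4)
clients = [np.full(4, 1.0), np.full(4, 1.5)]
for step in range(200):
    w, uplink_scalars = es_federated_round(w, clients, loss_fn, seed=step)
```

Under this pattern the per-round uplink cost is `pop_size` scalars per client, independent of the model dimension, which is the source of the communication savings the abstract claims.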

Original language: English
Title of host publication: Neural Information Processing - 32nd International Conference, ICONIP 2025, Proceedings
Editors: Tadahiro Taniguchi, Chi Sing Andrew Leung, Tadashi Kozuno, Junichiro Yoshimoto, Mufti Mahmud, Maryam Doborjeh, Kenji Doya
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 250-264
Number of pages: 15
ISBN (Print): 9789819543830
DOIs
State: Published - 2026
Event: 32nd International Conference on Neural Information Processing, ICONIP 2025 - Okinawa, Japan
Duration: 20 Nov 2025 - 24 Nov 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 16312 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 32nd International Conference on Neural Information Processing, ICONIP 2025
Country/Territory: Japan
City: Okinawa
Period: 20/11/25 - 24/11/25

Keywords

  • Client Selection
  • Communication Efficiency
  • Evolutionary Strategies
  • Federated Learning
  • Heterogeneous Data
