
H2Tune: Federated Foundation Model Fine-Tuning with Hybrid Heterogeneity

  • Wei Guo
  • Siyuan Lu
  • Yiqi Tong
  • Zhaojun Hu
  • Fuzhen Zhuang*
  • Xiao Zhang*
  • Tao Fan
  • Jin Dong
  • *Corresponding authors of this work
  • Beihang University
  • Heilongjiang University
  • Renmin University of China
  • Zhongguancun Laboratory
  • Shandong University
  • Ai Group
  • Beijing Academy of Blockchain and Edge Computing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Different from existing federated fine-tuning (FFT) methods for foundation models, hybrid heterogeneous federated fine-tuning (HHFFT) is an under-explored scenario in which clients exhibit double heterogeneity in model architectures and downstream tasks. This hybrid heterogeneity introduces two significant challenges: 1) heterogeneous matrix aggregation, where clients adopt different large-scale foundation models according to their task requirements and resource limitations, leading to dimensional mismatches during LoRA parameter aggregation; and 2) multi-task knowledge interference, where local shared parameters, trained on both task-shared and task-specific knowledge, cannot guarantee that only task-shared knowledge is transferred between clients. To address these challenges, we propose H2Tune, a federated foundation model fine-tuning framework for hybrid heterogeneity. H2Tune consists of three key components: (i) sparsified triple matrix decomposition, which aligns hidden dimensions across clients by constructing rank-consistent middle matrices, with adaptive sparsification based on client resources; (ii) relation-guided matrix layer alignment, which handles heterogeneous layer structures and representation capabilities; and (iii) an alternating task-knowledge disentanglement mechanism, which decouples the shared and task-specific knowledge in local model parameters through alternating optimization. Theoretical analysis proves a convergence rate of O(1/√T). Extensive experiments show that our method achieves up to a 15.4% accuracy improvement over state-of-the-art baselines.
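The dimensional-mismatch problem and the middle-matrix remedy in component (i) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the paper's exact formulation: the factor shapes, the magnitude-based top-k sparsification rule, and the uniform averaging weights are all assumptions made here for demonstration. Each client factors its LoRA update into three matrices B·M·A, where B and A match the client's own hidden size, while M is a fixed r×r middle matrix; only M is aggregated on the server, so clients with different hidden sizes never collide.

```python
import numpy as np

def sparsify(M, keep_ratio):
    """Keep only the largest-magnitude entries of the middle matrix.
    Stand-in for the paper's resource-adaptive sparsification."""
    k = int(np.ceil(keep_ratio * M.size))
    thresh = np.sort(np.abs(M), axis=None)[-k]  # k-th largest magnitude
    return M * (np.abs(M) >= thresh)

def aggregate(middles, weights):
    """Server-side step: average only the rank-consistent r x r middle
    matrices, so heterogeneous client dimensions d_i are never mixed."""
    return sum(w * Mi for w, Mi in zip(weights, middles))

rng = np.random.default_rng(0)
r = 4  # shared LoRA rank across clients
clients = []
for d in (8, 16):  # two clients with different hidden sizes
    B = rng.normal(size=(d, r))   # down-projection, client-specific
    M = rng.normal(size=(r, r))   # rank-consistent middle matrix
    A = rng.normal(size=(r, d))   # up-projection, client-specific
    delta_W = B @ M @ A           # LoRA-style weight update, shape (d, d)
    clients.append((B, M, A))

sparse = [sparsify(M, keep_ratio=0.5) for _, M, _ in clients]
M_global = aggregate(sparse, weights=[0.5, 0.5])
print(M_global.shape)  # (4, 4): shared shape despite d=8 vs d=16
```

The design point is that B and A never leave the client, which is what makes aggregation across architectures with d=8 and d=16 dimensionally well-defined.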

Original language: English
Title of host publication: ECAI 2025 - 28th European Conference on Artificial Intelligence, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025 - Proceedings
Editors: Ines Lynce, Nello Murano, Mauro Vallati, Serena Villata, Federico Chesani, Michela Milano, Andrea Omicini, Mehdi Dastani
Publisher: IOS Press BV
Pages: 3178-3185
Number of pages: 8
ISBN (Electronic): 9781643686318
DOI
Publication status: Published - 21 Oct 2025
Event: 28th European Conference on Artificial Intelligence, ECAI 2025, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025 - Bologna, Italy
Duration: 25 Oct 2025 → 30 Oct 2025

Publication series

Name: Frontiers in Artificial Intelligence and Applications
Volume: 413
ISSN (Print): 0922-6389
ISSN (Electronic): 1879-8314

Conference

Conference: 28th European Conference on Artificial Intelligence, ECAI 2025, including 14th Conference on Prestigious Applications of Intelligent Systems, PAIS 2025
Country/Territory: Italy
City: Bologna
Period: 25/10/25 → 30/10/25
