
Interpretable Mixture of Experts for Decomposition Network on Server Performance Metrics Forecasting

  • Fang Peng
  • Xin Ji
  • Le Zhang
  • Junle Wang
  • Kui Zhang
  • Wenjun Wu*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The accurate forecasting of server performance metrics, such as CPU utilization, memory usage, and network bandwidth, is critical for optimizing resource allocation and ensuring system reliability in large-scale computing environments. In this paper, we introduce the Mixture of Experts for Decomposition Kolmogorov–Arnold Network (MOE-KAN), a novel approach designed to improve both the accuracy and interpretability of server performance prediction. The MOE-KAN framework employs a decomposition strategy that breaks down complex, nonlinear server performance patterns into simpler, more interpretable components, facilitating a clearer understanding of how predictions are made. By leveraging a Mixture of Experts (MOE) model, trend and residual components are learned by specialized experts, whose outputs are transparently combined to form the final prediction. The Kolmogorov–Arnold Network further enhances the model’s ability to capture intricate input–output relationships while maintaining transparency in its decision-making process. Experimental results on real-world server performance datasets demonstrate that MOE-KAN not only outperforms traditional models in terms of accuracy but also provides a more trustworthy and interpretable forecasting framework. This makes it particularly suitable for real-time server management and capacity planning, offering both reliability and interpretability in predictive models.
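The decomposition-and-experts idea described above can be illustrated with a minimal sketch. This is not the paper's implementation (the actual MOE-KAN uses Kolmogorov–Arnold Networks and a learned gating mechanism); it only shows the transparent two-expert structure the abstract describes: a series is split into a trend component and a residual component, each is forecast by its own simple expert, and the final prediction is their sum. The moving-average decomposition, the linear autoregressive experts, and all function names here are illustrative assumptions.

```python
import numpy as np

def moving_average(x, w=5):
    # Trend component via a centered moving average (edge-padded so the
    # output has the same length as the input).
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    return np.convolve(xp, np.ones(w) / w, mode="valid")

def fit_linear_expert(series, lag=3):
    # A simple least-squares "expert": predict the next value from the
    # previous `lag` values of its assigned component.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_one_step(series, lag=3, w=5):
    # Decompose: series = trend + residual.
    trend = moving_average(series, w)
    residual = series - trend
    # Each component is learned by its own specialized expert.
    trend_coef = fit_linear_expert(trend, lag)
    resid_coef = fit_linear_expert(residual, lag)
    # Transparent combination: the final forecast is the sum of the
    # two experts' predictions, so each contribution is inspectable.
    trend_pred = trend[-lag:] @ trend_coef
    resid_pred = residual[-lag:] @ resid_coef
    return trend_pred + resid_pred

# Synthetic CPU-utilization-like series: linear trend plus noise.
rng = np.random.default_rng(0)
t = np.arange(100, dtype=float)
series = 0.5 * t + rng.normal(0.0, 1.0, 100)
pred = forecast_one_step(series)
```

Because the combination is additive, the trend expert's and residual expert's contributions can be reported separately, which is the interpretability property the abstract emphasizes.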

Original language: English
Article number: 4116
Journal: Electronics (Switzerland)
Volume: 13
Issue number: 20
DOI
Publication status: Published - Oct 2024

