LLM-optimized wavelet packet transform for synchronous condenser fault prediction

Dongqing Zhang*, Chaofeng Zhang, Michel Kadoch, Tao Hong, Shenglong Li, Wenqiang Zhao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes an innovative approach for predicting faults in synchronous condensers in ultra-high voltage direct current (UHVDC) transmission systems. The framework combines Wavelet Packet Transform (WPT) for intelligent feature extraction with an enhanced Gated Recurrent Unit (GRU) network augmented by multi-head attention mechanisms. WPT is employed to efficiently decompose fault signals into multiple frequency sub-bands, facilitating the extraction of fault features such as energy, entropy, and statistical moments. By applying Large Language Models (LLMs) to WPT, an intelligent feature selection mechanism significantly improves both detection accuracy and processing efficiency. The Multi-Head Attention GRU (MHA-GRU) network architecture is designed to capture complex temporal dependencies in fault signals while maintaining computational efficiency. Comprehensive experimental results demonstrate that the framework consistently outperforms state-of-the-art methods across all reported performance metrics, including classification accuracy, detection time, and false alarm rate. The system exhibits robust stability under varying load conditions, with particularly significant improvements in air-gap eccentricity fault detection. The proposed approach provides a reliable solution for early fault prediction in UHVDC synchronous condensers, enabling timely maintenance intervention before minor issues develop into critical failures.
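To make the WPT feature-extraction step concrete, the sketch below shows a minimal wavelet packet decomposition and the per-band energy/entropy features the abstract mentions. This is an illustration only, not the authors' implementation: it uses the Haar wavelet (the paper does not specify the mother wavelet), a pure-NumPy packet tree, and a synthetic test signal; the function names (`haar_step`, `wavelet_packet`, `band_features`) are hypothetical.

```python
import numpy as np

def haar_step(x):
    # One Haar analysis step: split a signal into orthonormal
    # approximation (low-pass) and detail (high-pass) halves.
    x = x.reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return approx, detail

def wavelet_packet(x, level):
    # Full wavelet packet tree: unlike the plain DWT, BOTH the
    # approximation and the detail branch are decomposed at every
    # level, yielding 2**level frequency sub-bands.
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(level):
        nxt = []
        for n in nodes:
            a, d = haar_step(n)
            nxt.extend([a, d])
        nodes = nxt
    return nodes

def band_features(nodes):
    # Per-band energy, plus Shannon entropy of the normalized
    # energy distribution across sub-bands (two of the fault
    # features named in the abstract).
    energies = np.array([np.sum(n ** 2) for n in nodes])
    p = energies / energies.sum()
    entropy = -np.sum(p * np.log(p + 1e-12))
    return energies, entropy

# Toy "fault signal": 50 Hz fundamental plus a weaker
# high-frequency disturbance, sampled at 1024 Hz for 1 s.
t = np.arange(1024) / 1024.0
sig = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 300 * t)

bands = wavelet_packet(sig, level=3)   # 8 sub-bands of 128 samples
energies, ent = band_features(bands)
```

Because the Haar filter pair is orthonormal, the total energy of the sub-bands equals the energy of the input signal, which is a useful sanity check before feeding such features to a downstream classifier like the MHA-GRU.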

Original language: English
Article number: e0330429
Journal: PLOS ONE
Volume: 20
Issue number: 8 August
State: Published - Aug 2025
