Adverse-pressure-gradient turbulence model improvement via mixing-length and symbolic regression

Han Qi Song, Jin Rong Zhang, Rui Jie Bai, Ming Ze Ma, Chao Yan*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

A novel theoretical formulation for the mixing length (luv+) in the logarithmic region under adverse-pressure-gradient (APG) conditions is developed, based on the logarithmic decay law of the total shear stress. Using a symbolic regression (SR)-based machine learning algorithm, we establish a transition function for luv+ in the buffer layer under APG. Combining these two developments yields a comprehensive mathematical expression for the near-wall luv+, designated luv_SR+. Extensive validation against multiple high-fidelity datasets shows excellent agreement, confirming the broad applicability of luv_SR+. Building upon luv_SR+, we then derive a formulation for the turbulent eddy viscosity (νt+) under APG conditions, which reveals inherent limitations in the νt+ calculation of the Menter shear stress transport (SST) turbulence model. To address these limitations, we propose the LSR SST model. Validation across six flow-separation cases demonstrates that the LSR SST model significantly improves the predicted wall friction coefficient (Cf) distribution and separation-point location compared to the original Menter SST model. The results show that the LSR SST model effectively resolves the premature-separation issue inherent in the original Menter SST model.
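The abstract's pipeline rests on the classical mixing-length closure, in which the eddy viscosity in wall units follows from νt+ = (l+)² |du+/dy+|. The sketch below illustrates that relation with the standard van Driest-damped mixing length and the log-law velocity gradient; it is a generic textbook example, not the paper's luv_SR+ expression or its APG corrections, which are not given here.

```python
import math

# Classical mixing-length closure in wall units:
#   nu_t+ = (l+)^2 * |du+/dy+|
# with the standard van Driest-damped mixing length
#   l+ = kappa * y+ * (1 - exp(-y+ / A+))
# and the log-law approximation du+/dy+ = 1 / (kappa * y+).
# NOTE: generic illustration only; the paper's luv_SR+ and its
# APG-dependent transition function are not reproduced here.

KAPPA = 0.41   # von Karman constant
A_PLUS = 26.0  # van Driest damping constant

def mixing_length_plus(y_plus: float) -> float:
    """Van Driest-damped mixing length l+ in wall units."""
    return KAPPA * y_plus * (1.0 - math.exp(-y_plus / A_PLUS))

def eddy_viscosity_plus(y_plus: float) -> float:
    """Eddy viscosity nu_t+ from the mixing-length closure,
    using the log-law gradient du+/dy+ = 1/(kappa * y+)."""
    dudy_plus = 1.0 / (KAPPA * y_plus)
    return mixing_length_plus(y_plus) ** 2 * dudy_plus

# In the log layer (y+ >> A+) the damping factor tends to 1 and
# nu_t+ -> kappa * y+, recovering the textbook log-region result.
print(eddy_viscosity_plus(300.0))  # ~ 0.41 * 300 = 123
```

In the buffer layer (y+ ≈ 5–30) the exponential damping suppresses l+, which is precisely the region where, per the abstract, the SR-learned transition function replaces such an ad hoc damping form.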

Original language: English
Article number: 110993
Journal: Aerospace Science and Technology
Volume: 168
State: Published - Jan 2026

Keywords

  • Adverse pressure gradient
  • Machine learning
  • Mixing length
  • Symbolic regression
  • Turbulence model
