Robust neural tracking of linguistic units relates to distractor suppression

Yayue Gao*, Jianfeng Zhang, Qian Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In a complex auditory scene, speech comprehension involves several stages: for example, segregating the target from the background, recognizing syllables, and integrating syllables into linguistic units (e.g., words). Although speech segregation is robust, as shown by invariant neural tracking of the target speech envelope, whether neural tracking of linguistic units is equally robust, and how such robustness is achieved, remains unknown. To investigate these questions, we used electroencephalography to concurrently record neural responses tracking a rhythmic speech stream at its syllabic and word rates. Human participants listened to this target speech under a speech or a noise distractor at varying signal-to-noise ratios. Neural tracking at the word rate was not as robust as neural tracking at the syllabic rate: robust tracking of the target's words was observed only under the speech distractor, not under the noise distractor. Moreover, this robust word tracking correlated with successful suppression of distractor tracking. Critically, both word tracking and distractor suppression correlated with behavioural comprehension accuracy. In sum, our results suggest that robust neural tracking of higher-level linguistic units relates not only to target tracking but also to distractor suppression.

Original language: English
Pages (from-to): 641-650
Number of pages: 10
Journal: European Journal of Neuroscience
Volume: 51
Issue number: 2
State: Published - 1 Jan 2020

Keywords

  • attention
  • auditory scene analysis
  • hierarchical linguistic structures
  • neural tracking
