
“How Do You Understand? Your Eyes Show It”: Explainable Artificial Intelligence for Cross-Language Comprehension Prediction Through Eye Movement

  • Entong Gao
  • Hanyu Zhong
  • Ruiqing Yuan
  • Jialu Guo
  • Zhe Chen*
  • *Corresponding author for this work
  • Beihang University
  • Peking University

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Eye movements have long been linked to comprehension performance, serving as a valuable window into cognitive processing in human-computer interaction (HCI). This research investigates the potential of explainable artificial intelligence (XAI) to predict comprehension from eye movements across native and nonnative language scenarios. Study 1 applies deep learning models to comprehension prediction, Study 2 uses SHapley Additive exPlanations (SHAP) for model interpretability, and Study 3 conducts experiments with AI agents to optimize interaction strategies based on predicted comprehension levels. The findings reveal: 1) The Transformer model outperforms other models in predicting comprehension, with intelligibility predictions being more accurate than comprehensibility predictions, particularly in native scenarios. 2) In native scenarios, comprehension is closely linked to early eye movement activities, particularly blink activities, while nonnative comprehension relies more on later-stage processing, reflecting the increased cognitive demands of processing a nonnative language. 3) In nonnative environments, Reward Factor strategies are crucial for alleviating cognitive load and enhancing user engagement, compared to native contexts. The research provides a novel approach by integrating eye movement with XAI and agent experiments, revealing key eye movement features that correspond to comprehension and exploring how AI agents can tailor interaction strategies based on comprehension levels. This study highlights the potential for AI to improve user interaction by dynamically adjusting to comprehension levels, particularly in multilingual contexts, offering practical implications for personalized information systems and HCI.
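To illustrate the kind of attribution SHAP provides in Study 2, the sketch below computes exact Shapley values by hand for a toy linear "comprehension score" model. Everything here is an assumption for illustration only: the feature names (`fixation_duration`, `blink_rate`, `saccade_length`), the weights, and the baseline values are hypothetical and do not come from the paper, which applies the SHAP library to trained deep learning models.

```python
from itertools import permutations

# Hypothetical eye-movement features (illustrative, not from the paper)
features = ["fixation_duration", "blink_rate", "saccade_length"]

# Toy linear "comprehension score" model with made-up weights
weights = {"fixation_duration": 0.5, "blink_rate": -0.3, "saccade_length": 0.2}

def model(x):
    """Predict a comprehension score from a feature dict."""
    return sum(weights[f] * x[f] for f in features)

# Baseline (e.g., dataset means) used when a feature is "absent"
baseline = {"fixation_duration": 200.0, "blink_rate": 10.0, "saccade_length": 4.0}
# One observed trial to explain
instance = {"fixation_duration": 260.0, "blink_rate": 14.0, "saccade_length": 3.0}

def shapley_values(model, instance, baseline, features):
    """Exact Shapley values: average each feature's marginal contribution
    over all feature orderings, filling absent features from the baseline."""
    phi = {f: 0.0 for f in features}
    orderings = list(permutations(features))
    for order in orderings:
        x = dict(baseline)           # start from the baseline input
        prev = model(x)
        for f in order:
            x[f] = instance[f]       # reveal feature f
            curr = model(x)
            phi[f] += curr - prev    # marginal contribution of f
            prev = curr
    return {f: v / len(orderings) for f, v in phi.items()}

phi = shapley_values(model, instance, baseline, features)
# Efficiency property: the values sum to model(instance) - model(baseline)
```

For a linear model this reduces to `w_f * (x_f - baseline_f)` per feature; the exhaustive-permutation form above is what generalizes to the nonlinear deep models the paper actually explains (where the SHAP library approximates it efficiently rather than enumerating all orderings).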

Original language: English
Title of host publication: Cross-Cultural Design - 17th International Conference, CCD 2025, Held as Part of the 27th HCI International Conference, HCII 2025, Proceedings
Editors: Pei-Luen Patrick Rau
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 323-348
Number of pages: 26
ISBN (Print): 9783031937323
DOIs
State: Published - 2025
Event: 17th International Conference on Cross-Cultural Design, CCD 2025, held as part of the 27th HCI International Conference, HCII 2025 - Gothenburg, Sweden
Duration: 22 Jun 2025 – 27 Jun 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 15783 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 17th International Conference on Cross-Cultural Design, CCD 2025, held as part of the 27th HCI International Conference, HCII 2025
Country/Territory: Sweden
City: Gothenburg
Period: 22/06/25 – 27/06/25

Keywords

  • Agent experiment
  • Comprehension
  • Cross language
  • Deep learning
  • Explainable AI
  • Eye movement
