Human fatigue expression recognition through image-based dynamic multi-information and bimodal deep learning

Lei Zhao, Zengcai Wang*, Xiaojin Wang, Yazhou Qi, Qing Liu, Guoxin Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Human fatigue is a major cause of traffic accidents. To improve transportation safety, this paper proposes a framework for fatigue expression recognition using image-based facial dynamic multi-information and a bimodal deep neural network. First, facial landmarks and eye-region texture, which complement each other in fatigue expression recognition, are extracted from facial image sequences captured by a single camera. Then, two stacked autoencoder neural networks are trained on the landmark and texture features, respectively. Finally, the two trained networks are combined by learning a joint layer on top of them, forming a bimodal deep neural network. The model extracts a unified representation that fuses the landmark and texture modalities and classifies fatigue expressions accurately. The proposed system is tested on a human fatigue dataset collected in an actual driving environment. The experimental results demonstrate that the proposed method performs stably and robustly, achieving an average accuracy of 96.2%.
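The bimodal fusion described in the abstract (two modality-specific encoders joined by a shared layer feeding a classifier) can be sketched as a forward pass in NumPy. This is a minimal illustration, not the paper's implementation: the feature dimensions, layer sizes, and two-class output are assumed for the example, and the randomly initialised weights stand in for the pre-trained stacked-autoencoder encoders (layer-wise pretraining and fine-tuning are omitted).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative dimensions (not from the paper): e.g. 68 facial landmarks
# flattened to a 136-d vector, an eye-region texture patch to 256-d.
D_LANDMARK, D_TEXTURE = 136, 256
H_MODALITY, H_JOINT, N_CLASSES = 64, 32, 2   # fatigue vs. non-fatigue

# Random weights stand in for the two pre-trained stacked-autoencoder
# encoders and the learned joint layer; training is omitted here.
W_lm = rng.normal(0, 0.1, (D_LANDMARK, H_MODALITY))
W_tx = rng.normal(0, 0.1, (D_TEXTURE, H_MODALITY))
W_joint = rng.normal(0, 0.1, (2 * H_MODALITY, H_JOINT))
W_out = rng.normal(0, 0.1, (H_JOINT, N_CLASSES))

def bimodal_forward(landmark, texture):
    """Fuse landmark and texture features through a shared joint layer."""
    h_lm = sigmoid(landmark @ W_lm)     # landmark-specific encoder
    h_tx = sigmoid(texture @ W_tx)      # texture-specific encoder
    fused = np.concatenate([h_lm, h_tx], axis=-1)
    h_joint = sigmoid(fused @ W_joint)  # unified bimodal representation
    return softmax(h_joint @ W_out)     # class probabilities

# Forward pass on one synthetic frame's features.
probs = bimodal_forward(rng.normal(size=D_LANDMARK),
                        rng.normal(size=D_TEXTURE))
# probs has shape (N_CLASSES,) and sums to 1
```

The key design point is that each modality keeps its own encoder, so the joint layer learns correlations between the already-abstracted landmark and texture representations rather than between raw pixels and coordinates.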

Original language: English
Article number: 053024
Journal: Journal of Electronic Imaging
Volume: 25
Issue number: 5
DOIs
State: Published - 1 Sep 2016
Externally published: Yes

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • bimodal learning
  • dynamic multi-information
  • fatigue expression recognition
  • landmark
  • texture

