Learning gaze biases with head motion for head pose-free gaze estimation

  • Feng Lu*
  • Takahiro Okabe
  • Yusuke Sugano
  • Yoichi Sato

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

When estimating human gaze directions from captured eye appearances, most existing methods assume a fixed head pose, because head motion changes the eye appearance greatly and makes the estimation inaccurate. To handle this difficult problem, in this paper we propose a novel method that performs accurate gaze estimation without restricting the user's head motion. The key idea is to decompose the original free-head-motion problem into subproblems: an initial fixed-head-pose problem and subsequent compensations that correct the biases in the initial estimation. For the initial estimation, automatic image rectification and joint alignment with gaze estimation are introduced. Compensations are then computed either by learning-based regression or by geometric calculation. The merit of this compensation strategy is that the training requirement for allowing head motion is not significantly increased; capturing only a 5-second video clip is required. Experiments show that our method achieves an average accuracy of around 3° using only a single camera.
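The compensation idea in the abstract can be illustrated with a minimal sketch: an initial estimator trained for a fixed head pose produces biased gaze estimates under head motion, and a regressor learned from a short calibration clip maps head pose to that bias so it can be subtracted. The data, the linear bias model, and all variable names below are hypothetical stand-ins, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data (a stand-in for the short calibration clip):
# head poses (yaw, pitch) in degrees and the gaze biases observed at each pose.
head_poses = rng.uniform(-20, 20, size=(50, 2))
true_W = np.array([[0.6, 0.1], [0.05, 0.7]])  # unknown pose-to-bias mapping
gaze_biases = head_poses @ true_W + rng.normal(0, 0.1, size=(50, 2))

# Learning-based compensation: fit a linear map from head pose to bias
# by least squares (one simple choice of regressor).
W, *_ = np.linalg.lstsq(head_poses, gaze_biases, rcond=None)

def compensate(initial_gaze, head_pose):
    """Correct an initial fixed-head-pose gaze estimate for head motion."""
    return initial_gaze - head_pose @ W

# A new frame with head motion: the fixed-pose model's output is biased,
# and the learned regressor removes most of that bias.
pose = np.array([10.0, -5.0])
true_gaze = np.array([4.0, 2.0])
biased_estimate = true_gaze + pose @ true_W
corrected = compensate(biased_estimate, pose)
print(corrected)  # close to the true gaze direction (4.0, 2.0)
```

A geometric compensation would replace the learned map `W` with a correction computed from the camera and head-pose geometry; the subtraction step stays the same.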

Original language: English
Pages (from-to): 169-179
Number of pages: 11
Journal: Image and Vision Computing
Volume: 32
Issue number: 3
State: Published - Mar 2014
Externally published: Yes

Keywords

  • Appearance-based approach
  • Free head motion
  • Gaze estimation
  • Head pose compensation

