A robust and fast monocular-vision-based hand tracking method for virtual touch screen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

An articulated hand tracking method for use in a virtual touch screen system (VTSS) with a monocular camera is proposed. The VTSS is a novel vision-based interactive surface that requires no physical touch screen. The articulated hand model provides more information than fingertips alone about contact with the interactive surface. Using a simplified kinematic model of the hand, a rough pose estimate for automatic initialization is obtained from the fingertips' orientation and position under the hypothesis of an open hand. The pose of the palm is then refined and updated from line features by the Gauss-Newton method, followed by re-computation of the fingers' pose based on a particle filter. Experiments show the potential of the proposed method for applications embedded in a VTSS.
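The palm-pose refinement described in the abstract is a Gauss-Newton solution of a nonlinear least-squares (NLLS) problem. As a rough illustration only, not the paper's actual implementation (which uses line features and a full kinematic hand model), the sketch below applies Gauss-Newton to a simplified analogue: refining a 2D pose (rotation theta, translation tx, ty) that aligns model points to observed points.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, 3):
            f = M[r][c] / M[c][c]
            for k in range(c, 4):
                M[r][k] -= f * M[c][k]
    x = [0.0] * 3
    for r in range(2, -1, -1):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def gauss_newton_pose(model, observed, iters=10):
    """Refine a 2D pose (theta, tx, ty) aligning model points to observations.

    Illustrative stand-in for the paper's palm-pose refinement step:
    each iteration linearizes the residuals and solves the normal equations.
    """
    theta, tx, ty = 0.0, 0.0, 0.0
    for _ in range(iters):
        c, s = math.cos(theta), math.sin(theta)
        JTJ = [[0.0] * 3 for _ in range(3)]
        JTr = [0.0] * 3
        for (x, y), (qx, qy) in zip(model, observed):
            # Residual: transformed model point minus observed point.
            rx = c * x - s * y + tx - qx
            ry = s * x + c * y + ty - qy
            # Jacobian rows: d(residual)/d(theta, tx, ty).
            Jx = [-s * x - c * y, 1.0, 0.0]
            Jy = [c * x - s * y, 0.0, 1.0]
            for i in range(3):
                JTr[i] += Jx[i] * rx + Jy[i] * ry
                for j in range(3):
                    JTJ[i][j] += Jx[i] * Jx[j] + Jy[i] * Jy[j]
        # Gauss-Newton step: (J^T J) d = -J^T r.
        d = solve3(JTJ, [-v for v in JTr])
        theta += d[0]
        tx += d[1]
        ty += d[2]
    return theta, tx, ty
```

In the paper itself the residuals come from line features of the palm rather than point correspondences, but the structure of the update, linearize, solve the normal equations, step, is the same.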

Original language: English
Title of host publication: Proceedings of the 2009 2nd International Congress on Image and Signal Processing, CISP'09
DOIs
State: Published - 2009
Event: 2009 2nd International Congress on Image and Signal Processing, CISP'09 - Tianjin, China
Duration: 17 Oct 2009 - 19 Oct 2009

Publication series

Name: Proceedings of the 2009 2nd International Congress on Image and Signal Processing, CISP'09

Conference

Conference: 2009 2nd International Congress on Image and Signal Processing, CISP'09
Country/Territory: China
City: Tianjin
Period: 17/10/09 - 19/10/09

Keywords

  • Hand tracking
  • Image understanding
  • Nonlinear least-squares (NLLS)
  • Virtual touch screen

