Ethnicity classification based on a hierarchical fusion

  • De Zhang*
  • Yunhong Wang
  • Zhaoxiang Zhang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we propose a cascaded multimodal biometric system that fuses frontal face and lateral gait for the specific problem of ethnicity classification. The system first classifies human ethnicity from gait cues recorded by a long-distance camera, and resorts to classification using facial images captured by a short-distance camera only when gait-based ethnicity identification fails. For gait, we use the Gait Energy Image (GEI), a compact spatio-temporal representation of gait in video, to characterize human walking properties. For face, we extract the well-known Gabor features to render effective facial appearance information. Experimental results on a database of 22 subjects, comprising 12 East Asian and 10 South American participants, show that this cascaded system provides competitive discriminative power on ethnicity, with a correct classification rate above 95%.
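The GEI mentioned in the abstract is, at its core, the per-pixel average of size-normalized binary silhouette frames over a gait cycle. A minimal sketch of that averaging step follows, using toy silhouettes; the paper's actual silhouette extraction and alignment preprocessing is not reproduced here:

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Compute a Gait Energy Image (GEI) by averaging binary silhouette
    frames over one gait cycle. Frames are assumed to be pre-aligned and
    normalized to a common size (an assumption of this sketch)."""
    frames = np.stack([s.astype(np.float64) for s in silhouettes])
    # Each GEI pixel is the fraction of frames in which it is foreground.
    return frames.mean(axis=0)

# Toy example: three 4x3 "silhouettes" (1 = body pixel, 0 = background)
s1 = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0], [1, 0, 1]])
s2 = np.array([[0, 1, 0], [1, 1, 0], [0, 1, 0], [1, 0, 0]])
s3 = np.array([[0, 1, 0], [0, 1, 1], [0, 1, 0], [0, 0, 1]])
gei = gait_energy_image([s1, s2, s3])
```

Pixels that stay foreground across the whole cycle (e.g. the torso) approach 1.0, while limb pixels take intermediate values, which is what gives the GEI its motion-encoding grey-level appearance.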

Original language: English
Title of host publication: Biometric Recognition - 7th Chinese Conference, CCBR 2012, Proceedings
Pages: 300-307
Number of pages: 8
DOIs
State: Published - 2012
Event: 7th Chinese Conference on Biometric Recognition, CCBR 2012 - Guangzhou, China
Duration: 4 Dec 2012 – 5 Dec 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7701 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 7th Chinese Conference on Biometric Recognition, CCBR 2012
Country/Territory: China
City: Guangzhou
Period: 4/12/12 – 5/12/12

Keywords

  • Ethnicity
  • Face
  • Fusion
  • Gait
