Detecting dynamic human shadow based on low altitude moving platform by co-training

  • Zhigang Xie
  • Shaoxing Hu
  • Aiwu Zhang
  • Weidong Sun (corresponding author)

Research output: Contribution to journal › Article › peer-review

Abstract

In video surveillance, background subtraction is widely used to detect human shadows, but it cannot achieve dynamic detection on low altitude moving platforms. To address the problem of dynamic human shadow detection onboard such platforms, a novel method is proposed in this paper. First, based on the characteristics of outdoor human shadows, three kinds of pixel based features are improved, an area based feature named "luminance contrast" is proposed, and an optimized combination of these features is determined experimentally. Secondly, exploiting the independence between the pixel based and area based features, a two-view classifier based on co-training theory is constructed, together with a semi-supervised training strategy for this classifier. Then, random sampling is adopted to improve training efficiency, and a support vector machine is used to handle learning from small samples. Experimental results show that the proposed method achieves a high shadow detection rate and good robustness, and can effectively solve the problem of dynamic human shadow detection for low altitude moving platforms.
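The two-view co-training scheme described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: scikit-learn's `SVC` stands in for the paper's support vector machine, the signed SVM margin stands in for the confidence measure, and the synthetic "pixel based" and "area based" feature matrices (`XA`, `XB`) and all function names are hypothetical.

```python
# Hedged sketch of two-view co-training with SVMs.
# View A: pixel based features; view B: area based features
# (e.g. "luminance contrast"). Each view's SVM pseudo-labels the
# unlabeled samples it is most confident about; those labels then
# enlarge the training set seen by the other view.
import numpy as np
from sklearn.svm import SVC


def co_train(XA, XB, y, labeled, unlabeled, rounds=5, k=5):
    """Semi-supervised co-training over two feature views.

    XA, XB : (n, d) feature matrices, one per view.
    y      : (n,) labels; only entries at `labeled` indices are trusted.
    Returns one fitted SVM per view.
    """
    labeled, unlabeled, y = list(labeled), list(unlabeled), y.copy()
    for _ in range(rounds):
        for X in (XA, XB):  # each view teaches the shared label pool
            if not unlabeled:
                break
            clf = SVC(kernel="linear").fit(X[labeled], y[labeled])
            score = clf.decision_function(X[unlabeled])  # signed margin
            # pick the k unlabeled samples with the largest |margin|
            top = np.argsort(np.abs(score))[::-1][:k]
            for j in sorted(top, reverse=True):  # pop from the back first
                i = unlabeled.pop(j)
                y[i] = clf.classes_[int(score[j] > 0)]  # pseudo-label
                labeled.append(i)
    # final per-view classifiers trained on labeled + pseudo-labeled data
    clf_a = SVC(kernel="linear").fit(XA[labeled], y[labeled])
    clf_b = SVC(kernel="linear").fit(XB[labeled], y[labeled])
    return clf_a, clf_b


def predict_two_view(clf_a, clf_b, XA, XB):
    """Combine the two views by summing their decision margins."""
    s = clf_a.decision_function(XA) + clf_b.decision_function(XB)
    return clf_a.classes_[(s > 0).astype(int)]
```

A typical use would label only a handful of shadow/non-shadow samples by hand and let co-training propagate labels to the rest, which is where the abstract's "small samples learning" concern enters.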

Original language: English
Pages (from-to): 903-913
Number of pages: 11
Journal: Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao / Journal of Computer-Aided Design and Computer Graphics
Volume: 26
Issue number: 6
State: Published - Jun 2014

Keywords

  • Co-training
  • Dynamic human shadow detection
  • Low altitude moving platform
  • Pixel and area based shadow features
