Group Perception Based Self-adaptive Fusion Tracking

  • Yiyang Xing*
  • Shuai Wang
  • Yang Zhang
  • Shuangye Zhao
  • Yubin Wu
  • Jiahao Shen
  • Hao Sheng

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Multi-object tracking (MOT) is an important and representative task in computer vision, and tracking-by-detection is its dominant paradigm, so detection quality, feature representation ability, and the association algorithm all strongly affect tracking performance. On the one hand, pedestrians moving together in the same group share a similar motion pattern and can therefore indicate each other's moving state. We extract groups from detections and maintain the group relationships of trajectories during tracking. We propose a state transition mechanism that smooths detection bias, recovers missed detections, and rejects false detections. We also build a two-level group-detection association algorithm that improves association accuracy. On the other hand, different areas of the tracking scene affect the detections' appearance features in diverse and varying ways, which weakens the appearance features' representation ability. We propose a self-adaptive feature fusion strategy based on the tracking scene and the group structure, which yields fused features with stronger representative power for trajectory-detection association and thus improves tracking performance. In summary, we propose a novel Group Perception based Self-adaptive Fusion Tracking (GST) framework, comprising a Group concept with a Group Exploration Net, a Group Perception based State Transition Mechanism, and a Self-adaptive Feature Fusion Strategy. Experiments on the MOT17 dataset demonstrate the effectiveness of our method, which achieves competitive results compared to state-of-the-art methods.
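The abstract's self-adaptive feature fusion can be illustrated with a minimal sketch. The paper's actual weighting (learned from the tracking scene and group structure) is not detailed in this record, so the scalar `reliability` below is a hypothetical stand-in for that scene-dependent weight; the function names and the simple convex combination are assumptions for illustration only.

```python
import numpy as np

def fuse_features(appearance: np.ndarray,
                  motion: np.ndarray,
                  reliability: float) -> np.ndarray:
    """Blend appearance and motion features for trajectory-detection association.

    `reliability` in [0, 1] is a hypothetical scene/group-based weight:
    near 1 where appearance cues are trustworthy, near 0 where occlusion
    or lighting degrades them, so motion dominates instead.
    """
    w = float(np.clip(reliability, 0.0, 1.0))
    fused = w * appearance + (1.0 - w) * motion
    # L2-normalise so cosine similarities are comparable across detections.
    norm = np.linalg.norm(fused)
    return fused / norm if norm > 0 else fused
```

With `reliability = 1.0` the fused vector reduces to the normalised appearance feature; intermediate values interpolate between the two cues before normalisation.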

Original language: English
Title of host publication: Advances in Computer Graphics - 40th Computer Graphics International Conference, CGI 2023, Proceedings
Editors: Bin Sheng, Lei Bi, Jinman Kim, Nadia Magnenat-Thalmann, Daniel Thalmann
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 93-105
Number of pages: 13
ISBN (Print): 9783031500770
DOIs
State: Published - 2024
Event: 40th Computer Graphics International Conference, CGI 2023 - Shanghai, China
Duration: 28 Aug 2023 – 1 Sep 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14498 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 40th Computer Graphics International Conference, CGI 2023
Country/Territory: China
City: Shanghai
Period: 28/08/23 – 1/09/23

Keywords

  • Feature fusion
  • Group perception
  • Multi-object tracking (MOT)
  • Self-adaptive
