
Beyond Triplet Loss: Person Re-Identification With Fine-Grained Difference-Aware Pairwise Loss

  • Cheng Yan*
  • Guansong Pang
  • Xiao Bai
  • Changhong Liu
  • Xin Ning
  • Lin Gu
  • Jun Zhou
  • *Corresponding author for this work
  • Beihang University
  • University of Adelaide
  • Jiangxi Normal University
  • CAS - Institute of Semiconductors
  • The University of Tokyo
  • Griffith University Queensland

Research output: Contribution to journal › Article › peer-review

Abstract

Person Re-IDentification (ReID) aims at re-identifying persons from different viewpoints across multiple cameras. Capturing the fine-grained appearance differences is often the key to accurate person ReID, because many identities can be differentiated only when looking into these fine-grained differences. However, most state-of-the-art person ReID approaches, typically driven by a triplet loss, fail to effectively learn the fine-grained features as they are focused more on differentiating large appearance differences. To address this issue, we introduce a novel pairwise loss function that enables ReID models to learn the fine-grained features by adaptively enforcing an exponential penalization on the images of small differences and a bounded penalization on the images of large differences. The proposed loss is generic and can be used as a plugin to replace the triplet loss to significantly enhance different types of state-of-the-art approaches. Experimental results on four benchmark datasets show that the proposed loss substantially outperforms a number of popular loss functions by large margins; and it also enables significantly improved data efficiency.
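The core idea of the abstract can be illustrated with a small sketch: a pairwise penalty that is steep (exponentially shaped) when the appearance difference between images is small, yet saturates to a bounded value as the difference grows. This is only a minimal illustration of the qualitative behaviour described above, not the paper's actual loss; the function name, the `alpha` and `margin` hyper-parameters, and the choice of `tanh`/sigmoid shaping are all assumptions made here for clarity.

```python
import numpy as np

def difference_aware_pairwise_loss(d, same_identity, alpha=2.0, margin=1.0):
    """Illustrative pairwise loss on an embedding distance d.

    Negative pairs (different identities) with a small distance, i.e. only
    fine-grained differences, receive a penalty that grows exponentially as
    d shrinks, while the penalty stays bounded for large differences.
    NOTE: hyper-parameter names and the exact functional forms are
    assumptions for illustration, not taken from the paper.
    """
    d = float(d)
    if same_identity:
        # Positive pair: penalize any residual distance; tanh keeps the
        # penalty bounded in [0, 1) so easy pairs cannot dominate training.
        return np.tanh(alpha * d)
    # Negative pair: a sigmoid of (margin - d) rises exponentially as the
    # distance falls below the margin, but is bounded above by 1.
    z = alpha * (margin - d)
    return 1.0 / (1.0 + np.exp(-z))
```

A hard negative pair (small `d`, fine-grained difference) thus gets a much larger penalty than an easy, visually distinct one, which is the behaviour the abstract attributes to the proposed loss relative to a plain triplet loss.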

Original language: English
Pages (from-to): 1665-1677
Number of pages: 13
Journal: IEEE Transactions on Multimedia
Volume: 24
DOI
Publication status: Published - 2022
