
Information bottleneck and selective noise supervision for zero-shot learning

  • Lei Zhou
  • Yang Liu
  • Pengcheng Zhang
  • Xiao Bai*
  • Lin Gu
  • Jun Zhou
  • Yazhou Yao
  • Tatsuya Harada
  • Jin Zheng
  • Edwin Hancock

*Corresponding author for this work

Affiliations:

  • Beihang University
  • RIKEN
  • The University of Tokyo
  • Griffith University Queensland
  • Nanjing University of Science and Technology
  • University of York

Research output: Contribution to journal › Article › peer-review

Abstract

Zero-shot learning (ZSL) aims to recognize novel classes by transferring semantic knowledge from seen classes to unseen classes. Although many ZSL methods rely on a direct mapping between the visual and the semantic space, calibration deviation and the hubness problem limit their generalization to unseen classes. Recently emerged generative ZSL methods synthesize image features for unseen classes, transforming ZSL into a supervised classification problem. However, most generative models still suffer from the seen-unseen bias problem because only seen data is used for training. To address these issues, we propose a novel bidirectional embedding based generative model with a tight visual-semantic coupling constraint. We learn a unified latent space that calibrates the embedded parametric distributions of both the visual and the semantic space. Since the embedding of high-dimensional visual features contains much non-semantic information, the alignment of visual and semantic distributions in the latent space would inevitably deviate. Therefore, we introduce an information bottleneck constraint to ZSL for the first time to preserve essential attribute information during the mapping. Specifically, we utilize uncertainty estimation and the wake-sleep procedure to alleviate feature noise and improve the model's abstraction capability. In addition, our method can be easily extended to the transductive ZSL setting by generating labels for unseen images; since these generated labels are noisy, we introduce a robust self-training loss to handle the label-noise problem. Extensive experimental results show that our method outperforms state-of-the-art methods in different ZSL settings on most benchmark datasets.
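To make the information bottleneck idea in the abstract concrete, the sketch below shows a generic variational IB objective of the kind commonly used for this constraint: a task term (cross-entropy on predictions from the latent code) plus a compression term (a closed-form KL divergence between a diagonal-Gaussian encoder posterior and a standard-normal prior, weighted by β). This is an illustrative numpy sketch of the general technique, not the paper's actual formulation; all function names and the β value are assumptions made here for demonstration.

```python
import numpy as np

def gaussian_kl_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) ), per sample.

    This is the compression term of a variational information bottleneck:
    it penalizes latent codes that carry more information than the prior.
    """
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def ib_objective(logits, labels, mu, log_var, beta=0.01):
    """Variational IB loss: cross-entropy (task term) + beta * KL (compression term).

    logits  : (N, C) class scores predicted from the latent code
    labels  : (N,)   integer ground-truth labels
    mu      : (N, D) posterior means of the latent code
    log_var : (N, D) posterior log-variances of the latent code
    beta    : trade-off between task accuracy and compression (assumed value)
    """
    # Numerically stable log-softmax for the cross-entropy term.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    cross_entropy = -log_probs[np.arange(len(labels)), labels]

    kl = gaussian_kl_to_standard_normal(mu, log_var)
    return np.mean(cross_entropy + beta * kl)
```

When the posterior matches the prior exactly (zero mean, unit variance), the KL term vanishes and only the classification loss remains; increasing β squeezes more non-semantic information out of the latent code, which is the role the abstract assigns to the bottleneck constraint.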

Original language: English
Pages (from-to): 2239-2261
Number of pages: 23
Journal: Machine Learning
Volume: 112
Issue number: 7
DOIs
State: Published - Jul 2023

Keywords

  • Information bottleneck
  • Label-noise learning
  • Transductive ZSL
  • Uncertainty estimation
  • Zero-shot learning
