Few-Shot Scene Classification with attention mechanism in Remote Sensing

  • Xuanye Li
  • Hongguang Li*
  • Ruonan Yu
  • Fei Wang

  *Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Remote sensing scene classification is an active research topic in computer vision and is of great significance for the semantic understanding of remote sensing images. At present, methods based on deep learning dominate this field; however, they suffer from a lack of samples and poor model generalization in practical application scenarios. This paper therefore proposes a few-shot remote sensing scene classification method based on an attention mechanism and designs a dual-branch similarity measurement structure. The method follows a meta-learning training strategy that divides the dataset into tasks. Meanwhile, the input images are divided into blocks to preserve the feature distribution within each remote sensing image. A lightweight attention module is then introduced into the feature extraction network to reduce the risk of overfitting and to ensure that discriminative features are obtained. Finally, a dual-branch similarity measurement module based on Earth Mover's Distance is added to improve the discriminative ability of the classifier. The results show that, compared with classic few-shot learning methods, the proposed method significantly improves classification performance.
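To make the similarity-measurement idea concrete, the following is a minimal sketch (not the authors' implementation) of an EMD-style comparison between two images represented as sets of per-block feature vectors. It assumes equal-size block sets with uniform weights, in which case the Earth Mover's Distance reduces to an optimal one-to-one assignment over a pairwise cost matrix; the function name `emd_similarity` and the cosine cost are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def emd_similarity(feat_a, feat_b):
    """EMD-style similarity between two equal-size sets of local
    (per-block) feature vectors, using an optimal assignment on a
    cosine-distance cost matrix."""
    # L2-normalize each local feature vector.
    a = feat_a / np.linalg.norm(feat_a, axis=1, keepdims=True)
    b = feat_b / np.linalg.norm(feat_b, axis=1, keepdims=True)
    # Cost of matching block i of image A to block j of image B.
    cost = 1.0 - a @ b.T
    rows, cols = linear_sum_assignment(cost)  # minimum-cost matching
    # Lower total transport cost -> higher similarity.
    return 1.0 - cost[rows, cols].mean()

# Toy usage: two images, each split into 4 blocks with 8-dim features.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
sim_self = emd_similarity(x, x)                      # identical sets
sim_other = emd_similarity(x, rng.normal(size=(4, 8)))
```

Comparing an image with itself yields similarity 1.0, since each block matches its own copy at zero cost; unrelated block sets score lower, which is the behavior a metric-based few-shot classifier relies on.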

Original language: English
Article number: 012015
Journal: Journal of Physics: Conference Series
Volume: 1961
Issue number: 1
DOIs
State: Published - 6 Jul 2021
Event: 2021 International Conference on Computer Engineering and Innovative Application of VR, ICCEIA VR 2021 - Guangzhou, Virtual, China
Duration: 11 Jun 2021 - 13 Jun 2021

