Remote Sensing Image Compression in Visible/Near-Infrared Range Using Heterogeneous Compressive Sensing

  • Jin Li
  • Yao Fu
  • Guoning Li
  • Zilong Liu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The compressive sensing (CS) framework is well suited to onboard image compression for high-resolution remote sensing cameras in the visible/near-infrared range (VI/NI-RSCs) because of its low complexity at the sampling (measurement) stage. In this paper, we propose a new heterogeneous CS method for VI/NI-RSCs. Unlike conventional CS methods, which allocate sensing resources evenly, the proposed method exploits the texture-feature information of remote sensing images to guide the allocation of sensing resources: more measurements are assigned to high-frequency regions and fewer to low-frequency regions. This heterogeneous distribution of sensing resources yields higher reconstruction quality at the same compression performance, as well as higher compression performance at the same reconstruction quality. The shift of sensing resources is consistent with human visual interpretation of images, in which high-frequency regions such as edges and textures are the principal cues for ground-target identification. Experimental results indicate that the proposed method achieves better reconstruction quality than conventional CS methods that do not exploit texture features.
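The core idea described above can be sketched in code. The following is a minimal illustration (not the authors' actual algorithm): the image is split into blocks, a simple gradient-energy measure serves as the texture feature, and a fixed measurement budget is distributed in proportion to each block's energy, so edge/texture blocks receive more random Gaussian measurements than smooth blocks. The block size, budget, floor `m_min`, and the gradient-based texture measure are all assumptions chosen for the sketch.

```python
import numpy as np

def heterogeneous_cs_measurements(image, block=8, total_ratio=0.25, m_min=4):
    """Block-wise CS sampling with texture-guided measurement allocation.

    Illustrative sketch only: blocks with more gradient energy
    (edges/textures) receive more measurements; smooth blocks receive
    fewer, under a fixed total measurement budget.
    """
    rng = np.random.default_rng(0)
    h, w = image.shape
    nb_h, nb_w = h // block, w // block
    n = block * block                           # samples per block
    budget = int(total_ratio * nb_h * nb_w * n) # total measurement budget

    # Texture measure per block: summed gradient magnitude (assumed feature).
    gy, gx = np.gradient(image.astype(float))
    energy = np.empty((nb_h, nb_w))
    for i in range(nb_h):
        for j in range(nb_w):
            sl = np.s_[i * block:(i + 1) * block, j * block:(j + 1) * block]
            energy[i, j] = np.abs(gx[sl]).sum() + np.abs(gy[sl]).sum()

    # Allocate measurements proportionally to texture energy,
    # with a floor of m_min and a ceiling of n per block.
    weights = energy.ravel() / max(energy.sum(), 1e-12)
    m = np.clip((weights * budget).round().astype(int), m_min, n)

    # Sample each block with its own random Gaussian measurement matrix.
    measurements = []
    for k, (i, j) in enumerate(np.ndindex(nb_h, nb_w)):
        x = image[i * block:(i + 1) * block,
                  j * block:(j + 1) * block].reshape(-1)
        phi = rng.standard_normal((m[k], n)) / np.sqrt(m[k])
        measurements.append(phi @ x)
    return m.reshape(nb_h, nb_w), measurements
```

On an image whose left half is flat and whose right half is textured, the right-half blocks end up with many more measurements than the left-half blocks, mirroring the heterogeneous allocation the abstract describes; reconstruction (e.g., by block-wise sparse recovery) is omitted here.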

Original language: English
Article number: 8588994
Pages (from-to): 4932-4938
Number of pages: 7
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Volume: 11
Issue number: 12
DOIs
State: Published - Dec 2018
Externally published: Yes

Keywords

  • Heterogeneous compressive sensing (CS)
  • panchromatic images
  • remote sensing image compression

