Large-Scale Synthetic Urban Dataset for Aerial Scene Understanding

  • Qian Gao*
  • Xukun Shen
  • Wensheng Niu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Geometric extraction and semantic understanding from a bird's eye view play an important role in cyber-physical-social systems (CPSS), because they help humans or intelligent agents (IAs) perceive a larger range of the environment. However, due to the lack of a comprehensive dataset captured from an oblique perspective, fog-end deep learning algorithms for this purpose remain largely unexplored. In this paper, we propose a novel method to generate a large-scale synthetic dataset for geometric and semantic urban scene understanding from a bird's eye view. The method involves two main steps, modeling and rendering, which are carried out by CityEngine and Unreal Engine 4, respectively. In this way, aligned synthetic multi-modal data are obtained efficiently, including spectral images, semantic labels, depth maps, and normal maps. Specifically, terrain elevation, street graphs, building styles, and tree distributions are all randomly generated according to realistic conditions; a set of handcrafted semantic labels, annotated by color, is spread throughout the scene; and virtual cameras move along realistic trajectories of unmanned aerial vehicles (UAVs). To evaluate the practicality of our dataset, we manually labeled tens of aerial images downloaded from the Internet. The experimental results show that, in both pure and combined modes, the dataset improves performance significantly.
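The abstract describes randomly sampling scene parameters (terrain elevation, street graphs, building styles, tree distributions) and moving virtual cameras along realistic UAV trajectories. A minimal sketch of that idea is shown below; all function names, parameter ranges, and the orbit-style flight path are hypothetical illustrations, not the authors' actual CityEngine/Unreal Engine 4 pipeline.

```python
import math
import random

def random_scene_config(seed=None):
    """Draw high-level scene parameters, mimicking the paper's idea of
    randomizing terrain, streets, buildings, and trees within plausible
    real-world ranges. All names and ranges here are assumptions."""
    rng = random.Random(seed)
    return {
        "terrain_elevation_m": rng.uniform(0.0, 50.0),    # gentle urban relief
        "street_block_size_m": rng.uniform(60.0, 150.0),  # city-block spacing
        "building_style": rng.choice(["residential", "office", "industrial"]),
        "tree_density_per_ha": rng.uniform(10.0, 80.0),
    }

def uav_orbit_trajectory(center, radius_m, altitude_m, n_frames):
    """Sample virtual-camera poses along a circular, UAV-style orbit,
    each pose yawed to look obliquely toward the scene center."""
    poses = []
    for i in range(n_frames):
        theta = 2.0 * math.pi * i / n_frames
        x = center[0] + radius_m * math.cos(theta)
        y = center[1] + radius_m * math.sin(theta)
        yaw = math.degrees(math.atan2(center[1] - y, center[0] - x))
        poses.append({"x": x, "y": y, "z": altitude_m, "yaw_deg": yaw})
    return poses

config = random_scene_config(seed=42)
trajectory = uav_orbit_trajectory(center=(0.0, 0.0),
                                  radius_m=200.0,
                                  altitude_m=120.0,
                                  n_frames=8)
```

Seeding the generator makes each synthetic scene reproducible, so every rendered modality (spectral image, semantic label, depth, and normal map) can be regenerated from the same configuration.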

Original language: English
Article number: 9015998
Pages (from-to): 42131-42140
Number of pages: 10
Journal: IEEE Access
Volume: 8
DOIs
State: Published - 2020

Keywords

  • Deep learning
  • environment understanding
  • geometric extraction
  • large-scale urban scene
  • semantic segmentation
  • synthetic data set
