
Optimal FPGA-oriented lightweight network architecture search under multi-objective constraints

  • Xiaobin Li
  • Hongxu Jiang
  • Fangzheng Tian
  • Yonghua Zhang
  • Rui Miao
  • Donghuan Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Neural architecture search (NAS) has developed rapidly, but most algorithms target GPUs, so the searched models take little account of the target hardware platform. This paper proposes a novel NAS algorithm tailored to the unique computation and storage characteristics of FPGAs, improving the deployment efficiency of neural networks and automatically building FPGA-oriented network structures. First, we design the search space from lightweight candidate blocks. Then, we measure the real simulation latency of each candidate block on the FPGA platform and, for the first time, use it to direct the NAS process. Finally, we propose a differentiable representation of hardware latency and build a multi-objective NAS framework based on real simulation latency and FLOPS. In addition, by analyzing the relationship between the network convergence process and the model size, we achieve an optimal trade-off between accuracy and latency and build a network structure better suited to deployment on FPGAs. Compared with state-of-the-art manual and automated lightweight models, our method delivers better accuracy and latency. Extensive experiments on the ImageNet dataset show that the proposed method achieves a 2x speedup over a manual lightweight network of similar accuracy (MobileNet V2) and a 3x speedup over an equivalent automatically searched lightweight network (DARTS).
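The differentiable latency term described in the abstract can be illustrated with a short sketch. The paper itself does not publish its loss function, so this is only a common formulation of such a multi-objective NAS objective (in the style of differentiable NAS methods): per-block latencies and FLOPs are stored in lookup tables, and the expected cost under softmax-relaxed architecture weights is added to the task loss. All tensor values and the weighting coefficients below are hypothetical placeholders, not numbers from the paper.

```python
import torch
import torch.nn.functional as F

# Hypothetical lookup tables for one searchable layer: simulated FPGA
# latency (ms) and FLOPs (M) of each candidate block. Placeholder values.
block_latency = torch.tensor([0.8, 1.5, 2.3, 3.1])
block_flops = torch.tensor([12.0, 25.0, 40.0, 60.0])

def multi_objective_loss(task_loss, alpha, lam_lat=0.1, lam_flops=0.01):
    """Task loss plus differentiable surrogates of latency and FLOPs.

    `alpha` holds the architecture parameters for the layer; the softmax
    relaxation makes the expected hardware cost differentiable in alpha.
    """
    probs = F.softmax(alpha, dim=-1)              # relaxed block choice
    exp_latency = (probs * block_latency).sum()   # expected latency (ms)
    exp_flops = (probs * block_flops).sum()       # expected FLOPs (M)
    return task_loss + lam_lat * exp_latency + lam_flops * exp_flops

# Usage: gradients of the combined objective flow into alpha, steering
# the search toward blocks that are both accurate and fast on the FPGA.
alpha = torch.zeros(4, requires_grad=True)        # uniform initial weights
loss = multi_objective_loss(torch.tensor(1.2), alpha)
loss.backward()
```

With uniform weights the expected latency is simply the mean of the table; as training sharpens `alpha`, the surrogate approaches the latency of the single selected block.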

Original language: English
Title of host publication: 19th IEEE International Symposium on Parallel and Distributed Processing with Applications, 11th IEEE International Conference on Big Data and Cloud Computing, 14th IEEE International Conference on Social Computing and Networking and 11th IEEE International Conference on Sustainable Computing and Communications, ISPA/BDCloud/SocialCom/SustainCom 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 978-983
Number of pages: 6
ISBN (Electronic): 9781665435741
State: Published - 2021
Event: ISPA/BDCloud/SocialCom/SustainCom 2021 - New York, United States
Duration: 30 Sep 2021 - 3 Oct 2021

Publication series

Name: ISPA/BDCloud/SocialCom/SustainCom 2021

Conference

Conference: ISPA/BDCloud/SocialCom/SustainCom 2021
Country/Territory: United States
City: New York
Period: 30/09/21 - 3/10/21

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 7 - Affordable and Clean Energy

Keywords

  • FPGA
  • Latency constraint
  • Multi-objective
  • Neural architecture search

