
On the importance of network architecture in training very deep neural networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Very deep neural networks with hundreds of layers or more have achieved significant success in a variety of vision tasks, ranging from image classification and detection to image captioning. However, simply stacking more convolutional layers can suffer from the vanishing gradient problem and thus fail to reduce the training loss any further. The residual network [1] enables extremely deep models by introducing an identity mapping plus a residual learning term, which addresses the gradient back-propagation bottleneck well. In this paper, we investigate the residual module in detail, analyzing the ordering of operations within different blocks and modifying them one by one to achieve lower test error on the CIFAR-10 dataset. One key observation is that removing the original ReLU activation facilitates gradient propagation along the identity mapping path. Moreover, inspired by the ResNet block, we propose a random-jump scheme that skips some residual transformations during training, i.e., lower-level features can jump to any subsequent layer, bypassing the intermediate transformations directly. This upgrade to the network structure not only saves training time but also yields better performance.
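The two ideas in the abstract (an identity path with no ReLU on it, and randomly skipping residual transformations during training) can be illustrated with a toy sketch. This is not the authors' implementation; the block shape, names, and skip probability are illustrative assumptions, written with NumPy for simplicity:

```python
import random
import numpy as np

def residual_block(x, weight):
    # Identity mapping plus a residual term. Note there is no ReLU applied
    # to the identity path itself, so gradients can flow through the skip
    # connection unchanged (hypothetical toy block, not the paper's exact one).
    return x + np.maximum(0.0, x @ weight)

def forward(x, weights, skip_prob=0.0, rng=None):
    # Random-jump sketch: during training, each residual transformation may
    # be bypassed entirely, letting lower-level features jump directly to
    # higher layers. skip_prob is an assumed hyperparameter.
    rng = rng or random.Random(0)
    for w in weights:
        if rng.random() < skip_prob:
            continue  # skip this block's transformation: pure identity jump
        x = residual_block(x, w)
    return x
```

With `skip_prob=1.0` every block is bypassed and the input passes through unchanged; with `skip_prob=0.0` the network behaves like a plain residual network. Skipped blocks also cost no compute, which is the source of the training-time savings the abstract mentions.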

Original language: English
Title of host publication: ICSPCC 2016 - IEEE International Conference on Signal Processing, Communications and Computing, Conference Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509027088
DOIs
State: Published - 22 Nov 2016
Externally published: Yes
Event: 2016 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2016 - Hong Kong, China
Duration: 5 Aug 2016 - 8 Aug 2016

Publication series

Name: ICSPCC 2016 - IEEE International Conference on Signal Processing, Communications and Computing, Conference Proceedings

Conference

Conference: 2016 IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC 2016
Country/Territory: China
City: Hong Kong
Period: 5/08/16 - 8/08/16
