
Projection based weight normalization: Efficient method for optimization on oblique manifold in DNNs

Research output: Contribution to journal › Article › peer-review

Abstract

Optimizing deep neural networks (DNNs) often suffers from ill-conditioning. We observe that the scaling-based weight space symmetry (SBWSS) in rectified nonlinear networks causes this negative effect. Therefore, we propose to constrain the incoming weights of each neuron to be unit-norm, which is formulated as an optimization problem over the Oblique manifold. A simple yet efficient method, referred to as projection based weight normalization (PBWN), is developed to solve this problem. The proposed method has a regularization effect and collaborates well with the commonly used batch normalization technique. We conduct comprehensive experiments on several widely used image datasets, including CIFAR-10, CIFAR-100, SVHN and ImageNet, for supervised learning over state-of-the-art neural networks. The experimental results show that our method consistently improves the performance of different architectures. We also apply our method to the Ladder network for semi-supervised learning on the permutation-invariant MNIST dataset, where it achieves state-of-the-art results: we obtain test errors of 2.52%, 1.06%, and 0.91% with only 20, 50, and 100 labeled samples, respectively.
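The core idea described above can be sketched in a few lines: after an ordinary gradient update, each neuron's incoming weight vector (a row of the weight matrix) is projected back onto the unit sphere, so the matrix stays on the Oblique manifold. This is a minimal illustration under that reading of the abstract, not the paper's actual implementation; the function names and the plain-SGD update are hypothetical.

```python
import numpy as np

def project_to_oblique(W, eps=1e-8):
    # Normalize each row of W to unit L2 norm, so W lies on the
    # Oblique manifold: matrices whose rows all have norm 1.
    # Each row is one neuron's vector of incoming weights.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / (norms + eps)

def pbwn_sgd_step(W, grad, lr=0.1):
    # Hypothetical training step: a standard SGD update in the
    # ambient space, followed by projection back onto the manifold.
    W = W - lr * grad
    return project_to_oblique(W)
```

After every such step the unit-norm constraint holds exactly, which removes the scaling freedom (SBWSS) that the abstract identifies as a source of ill-conditioning.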

Original language: English
Article number: 107317
Journal: Pattern Recognition
Volume: 105
State: Published - Sep 2020

Keywords

  • Deep learning
  • Image classification
  • Oblique manifold
  • Weight normalization

