Abstract
Recently, deep neural networks (DNNs) have advanced mainly through network architectures and loss functions; the development of neuron models, however, has been quite limited. In this study, inspired by the mechanism of human cognition, a hyper-sausage coverage function (HSCF) neuron model with highly flexible plasticity is proposed. Then, a novel cross-entropy and volume-coverage (CE_VC) loss is defined, which compresses the volume of the hyper-sausage as much as possible and helps alleviate confusion among different classes, thus ensuring the intra-class compactness of the samples. Finally, a divisive iteration method is introduced, which treats each neuron model as a weak classifier and iteratively increases the number of weak classifiers. Thus, the optimal number of HSCF neurons is determined adaptively and an end-to-end learning framework is constructed. In particular, the HSCF neuron can be applied to classical DNNs to improve classification performance. Comprehensive experiments on eight datasets in several domains demonstrate the effectiveness of the proposed method. The proposed method exhibits the feasibility of boosting DNNs with neuron plasticity and provides a novel perspective for further developments in DNNs. The source code is available at https://github.com/Tough2011/HSCFNet.git.
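The geometric intuition behind a "hyper-sausage" can be sketched briefly: it is the set of points within some radius of a line segment in feature space, so class membership reduces to a point-to-segment distance test. The sketch below is illustrative only, assuming Euclidean feature space; the function names (`point_to_segment_distance`, `hsc_covers`) and the radius parameter `r` are hypothetical and not taken from the paper's implementation.

```python
import numpy as np

def point_to_segment_distance(x, a, b):
    """Distance from feature vector x to the segment [a, b],
    i.e. to the axis of a hyper-sausage with endpoints a and b."""
    ab = b - a
    # Project x onto the line through a and b, clamped to the segment.
    t = np.clip(np.dot(x - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    return np.linalg.norm(x - closest)

def hsc_covers(x, a, b, r):
    """A point is covered by the hyper-sausage iff it lies
    within radius r of the axis segment [a, b]."""
    return point_to_segment_distance(x, a, b) <= r
```

In this picture, shrinking `r` (and the segment length) corresponds to compressing the coverage volume, which is the role the CE_VC loss plays for intra-class compactness in the paper.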
| Original language | English |
|---|---|
| Article number | 109216 |
| Journal | Pattern Recognition |
| Volume | 136 |
| DOIs | |
| State | Published - Apr 2023 |
Keywords
- Brain-inspired
- Computer vision
- Deep neural networks
- Neuron model
- Pattern recognition