Abstract
Because its computation is dominated by additive operations and its network structure is simplified, the binary convolutional neural network (BCNN) is promising for Internet of Things scenarios that demand ultralow power consumption. By fully exploiting the in-memory computing advantages and low-current-consumption design of multilevel cell (MLC) spin-torque transfer magnetic random access memory (STT-MRAM), this paper proposes an MLC STT-MRAM-based computing-in-memory architecture that performs the convolutional operations of a BCNN to further reduce power consumption. Simulation results show that, compared with resistive random access memory (RRAM)- and spin-orbit-torque (SOT)-MRAM-based counterparts, the proposed architecture reduces power consumption by 35× and 59%, respectively, on the Modified National Institute of Standards and Technology (MNIST) data set.
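The additive-dominated computation the abstract refers to comes from binarization: with activations and weights constrained to {-1, +1}, each multiply-accumulate in a convolution reduces to a bitwise XNOR followed by a popcount, which is what makes bitwise in-memory evaluation attractive. The following minimal NumPy sketch (an illustration, not code from the paper) shows this equivalence for a single dot product:

```python
import numpy as np

def binary_conv_xnor_popcount(activations, weights):
    """Binarized dot product: with values in {-1, +1}, a
    multiply-accumulate reduces to XNOR plus popcount."""
    # Encode -1 -> 0 and +1 -> 1 as single bits.
    a_bits = (activations > 0).astype(np.uint8)
    w_bits = (weights > 0).astype(np.uint8)
    # XNOR is 1 exactly where the bits match, i.e. where the
    # {-1, +1} product would be +1.
    xnor = ~(a_bits ^ w_bits) & 1
    popcount = int(xnor.sum())
    n = activations.size
    # dot = (#matches) - (#mismatches) = 2 * popcount - n
    return 2 * popcount - n

a = np.array([+1, -1, +1, +1])
w = np.array([+1, +1, -1, +1])
print(binary_conv_xnor_popcount(a, w))  # equals int(np.dot(a, w)) = 0
```

A full BCNN convolution applies this same XNOR/popcount reduction over every receptive-field window and filter, which is the operation the proposed MLC STT-MRAM array accelerates in memory.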
| Original language | English |
|---|---|
| Article number | 8403888 |
| Journal | IEEE Transactions on Magnetics |
| Volume | 54 |
| Issue number | 11 |
| DOI | |
| Publication status | Published - Nov. 2018 |
Fingerprint
Dive into the research topics of 'A Multilevel Cell STT-MRAM-Based Computing In-Memory Accelerator for Binary Convolutional Neural Network'. Together they form a unique fingerprint.