
A Multilevel Cell STT-MRAM-Based Computing In-Memory Accelerator for Binary Convolutional Neural Network

Research output: Contribution to journal › Article › peer-review

Abstract

Because computation in a binary convolutional neural network (BCNN) is dominated by additive operations and the network itself is simplified, BCNNs are promising for Internet of Things scenarios that demand ultralow power consumption. By fully exploiting the advantages of in-memory computing and the low-current design of multilevel cell (MLC) spin-torque transfer magnetic random access memory (STT-MRAM), this paper proposes an MLC STT-MRAM-based computing-in-memory architecture that performs the convolutional operations of a BCNN to further reduce power consumption. Simulation results on the Modified National Institute of Standards and Technology (MNIST) data set show that, compared with resistive random access memory (RRAM)-based and spin-orbit-torque STT-MRAM-based counterparts, the proposed architecture reduces power consumption by 35× and 59%, respectively.
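The abstract's point that BCNN computation is dominated by additive operations comes from how binarized convolution works: with weights and activations constrained to {-1, +1}, each multiply-accumulate collapses to an XNOR followed by a popcount, which is what makes in-memory accelerators such as the one proposed here attractive. The following sketch (not the paper's implementation; a minimal NumPy illustration) shows the equivalence, using the identity dot(x, w) = 2 · popcount(XNOR(x, w)) − n for n binary elements:

```python
import numpy as np

def binary_conv2d(inputs, weights):
    """Valid-mode 2-D convolution for {-1, +1} binarized tensors.

    For a {-1, +1} encoding, x*w = +1 exactly when the signs agree,
    so the dot product over n elements is 2*popcount(XNOR) - n.
    """
    h, w = inputs.shape
    kh, kw = weights.shape
    n = kh * kw
    # Map {-1, +1} values to {0, 1} bits for the XNOR formulation.
    in_bits = inputs > 0
    w_bits = weights > 0
    out = np.empty((h - kh + 1, w - kw + 1), dtype=np.int32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = in_bits[i:i + kh, j:j + kw]
            agree = np.count_nonzero(~(patch ^ w_bits))  # popcount of XNOR
            out[i, j] = 2 * agree - n                    # equivalent dot product
    return out
```

Because the inner loop needs only bitwise agreement counts, the accumulation reduces to counting, which maps naturally onto the current-summing behavior of resistive memory arrays described in the abstract.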

Original language: English
Article number: 8403888
Journal: IEEE Transactions on Magnetics
Volume: 54
Issue number: 11
DOI
Publication status: Published - November 2018
