Memristor crossbar architectures for implementing deep neural networks
Complex & Intelligent Systems ( IF 5.0 ) Pub Date : 2021-07-20 , DOI: 10.1007/s40747-021-00282-4
Xiaoyang Liu 1, 2 , Zhigang Zeng 1, 2
Affiliation  

The paper presents memristor crossbar architectures for implementing the layers of deep neural networks, including the fully connected layer, the convolutional layer, and the pooling layer. The crossbars realize both positive and negative weight values and approximate various nonlinear activation functions. The layers constructed from these crossbars are then used to build the memristor-based multi-layer neural network (MMNN) and the memristor-based convolutional neural network (MCNN). Two in-situ weight-update schemes, the fixed-voltage update and the approximately linear update, are used to train the networks. Considering variations resulting from the inherent characteristics of memristors and from errors in the programming voltages, the robustness of the MMNN and MCNN to these variations is analyzed. Simulation results on standard datasets show that deep neural networks (DNNs) built from the memristor crossbars perform satisfactorily on pattern recognition tasks and exhibit a certain robustness to memristor variations.
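The signed-weight and update ideas in the abstract can be illustrated with a minimal numerical sketch. In many memristor crossbar designs, a signed weight matrix is realized as the difference of two non-negative conductance arrays, and a fixed-voltage update nudges each conductance by a constant step in the direction opposite the gradient. The function names, the scale factor, and the clipping range below are illustrative assumptions, not the paper's specific circuit parameters.

```python
import numpy as np

def crossbar_mvm(G_pos, G_neg, v, g=1.0):
    """Matrix-vector product of a differential crossbar pair.

    Each crossbar computes currents via Ohm's and Kirchhoff's laws,
    i.e. a dot product of conductances with input voltages; the signed
    weight is W = g * (G_pos - G_neg). g is an assumed scale factor.
    """
    return g * (G_pos - G_neg) @ v

def fixed_voltage_update(G, grad, step=0.01):
    """Sketch of a fixed-voltage in-situ update: every selected cell
    receives an identical-amplitude pulse, so each conductance moves by
    a constant step whose sign opposes the gradient. Conductances are
    clipped to an assumed valid device range [0, 1]."""
    return np.clip(G - step * np.sign(grad), 0.0, 1.0)

# Example: a 2x3 signed weight matrix from two non-negative conductance arrays
G_pos = np.array([[0.8, 0.0, 0.5],
                  [0.2, 0.6, 0.0]])
G_neg = np.array([[0.0, 0.3, 0.1],
                  [0.4, 0.0, 0.2]])
v = np.array([1.0, -1.0, 0.5])

y = crossbar_mvm(G_pos, G_neg, v)  # effective W = [[0.8, -0.3, 0.4], [-0.2, 0.6, -0.2]]
```

The differential-pair scheme is what allows non-negative physical conductances to encode negative weights; the approximately linear update mentioned in the abstract would instead scale the conductance change with the desired weight change rather than using a fixed step.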




Updated: 2021-07-20