Efficient and self-adaptive in-situ learning in multilayer memristor neural networks.
Nature Communications (IF 16.6), Pub Date: 2018-06-19, DOI: 10.1038/s41467-018-04484-2
Can Li, Daniel Belkin, Yunning Li, Peng Yan, Miao Hu, Ning Ge, Hao Jiang, Eric Montgomery, Peng Lin, Zhongrui Wang, Wenhao Song, John Paul Strachan, Mark Barnell, Qing Wu, R. Stanley Williams, J. Joshua Yang, Qiangfei Xia

Memristors with tunable resistance states are emerging building blocks of artificial neural networks. However, in situ learning on a large-scale multiple-layer memristor network has yet to be demonstrated because of challenges in device property engineering and circuit integration. Here we monolithically integrate hafnium oxide-based memristors with a foundry-made transistor array into a multiple-layer neural network. We experimentally demonstrate in situ learning capability and achieve competitive classification accuracy on a standard machine learning dataset, which further confirms that the training algorithm allows the network to adapt to hardware imperfections. Our simulation using the experimental parameters suggests that a larger network would further increase the classification accuracy. The memristor neural network is a promising hardware platform for artificial intelligence with high speed-energy efficiency.
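The abstract describes training a multilayer network in situ so that the learning rule itself compensates for device imperfections. The sketch below (not the authors' code) illustrates that idea in simulation: backpropagated weight updates are applied through a simple memristor model with a bounded conductance range, a finite number of programmable levels, and stochastic write noise. All network sizes and device parameters here are illustrative assumptions, not the experimental values reported in the paper.

```python
# Minimal sketch of in-situ-style training with a simulated imperfect memristor array.
# Assumed (illustrative) non-idealities: bounded weight range, quantized levels, write noise.
import numpy as np

rng = np.random.default_rng(0)

G_MIN, G_MAX = -1.0, 1.0     # representable weight range (e.g., differential conductance pairs)
N_LEVELS = 64                # finite number of programmable conductance levels
WRITE_NOISE = 0.02           # std of stochastic programming error

def program(weights, update):
    """Apply a weight update the way an imperfect analog array would."""
    target = np.clip(weights + update, G_MIN, G_MAX)
    step = (G_MAX - G_MIN) / (N_LEVELS - 1)
    quantized = np.round((target - G_MIN) / step) * step + G_MIN
    return quantized + rng.normal(0.0, WRITE_NOISE, size=weights.shape)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy two-layer network on a synthetic two-class problem.
n_in, n_hidden, n_out = 8, 16, 1
W1 = rng.uniform(-0.5, 0.5, (n_in, n_hidden))
W2 = rng.uniform(-0.5, 0.5, (n_hidden, n_out))

X = rng.normal(size=(512, n_in))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

lr = 0.5
for epoch in range(200):
    h = sigmoid(X @ W1)            # forward pass, layer 1
    out = sigmoid(h @ W2)          # forward pass, layer 2
    err = out - y
    # Backpropagation; because each update is written *through* the device model,
    # training can partially adapt to quantization and write noise.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 = program(W2, -lr * h.T @ d_out / len(X))
    W1 = program(W1, -lr * X.T @ d_h / len(X))

acc = np.mean((out > 0.5) == (y > 0.5))
print(f"training accuracy with simulated device imperfections: {acc:.3f}")
```

In this kind of simulation, accuracy typically remains close to the ideal-weight baseline despite the injected non-idealities, which is the qualitative point the abstract makes about the training algorithm adapting to hardware imperfections.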

Updated: 2018-06-19