sBSNN: Stochastic-Bits Enabled Binary Spiking Neural Network with On-Chip Learning for Energy Efficient Neuromorphic Computing at the Edge
arXiv - CS - Emerging Technologies Pub Date : 2020-02-25 , DOI: arxiv-2002.11163
Minsuk Koo, Gopalakrishnan Srinivasan, Yong Shim, and Kaushik Roy

In this work, we propose a stochastic Binary Spiking Neural Network (sBSNN) composed of stochastic spiking neurons and binary synapses (stochastic only during training) that computes probabilistically with one-bit precision for power-efficient and memory-compressed neuromorphic computing. We present an energy-efficient implementation of the proposed sBSNN using the 'stochastic bit' as the core computational primitive to realize the stochastic neurons and synapses, fabricated in a 90nm CMOS process, to achieve efficient on-chip training and inference for image recognition tasks. The measured data show that the 'stochastic bit' can be programmed to mimic spiking neurons and the stochastic Spike Timing Dependent Plasticity (sSTDP) rule for training the binary synaptic weights without expensive random number generators. Our results indicate that the proposed sBSNN realization offers up to 32x neuronal and synaptic memory compression compared to a full-precision (32-bit) SNN, and an energy efficiency of 89.49 TOPS/Watt for a two-layer fully connected SNN.
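To make the dataflow described above concrete, here is a minimal NumPy sketch (not the authors' implementation) of one timestep of a fully connected sBSNN layer: a probabilistic spiking neuron and a stochastic STDP update on binary weights. The sigmoid firing curve, the potentiation/depression probabilities `p_pot`/`p_dep`, and the {0, 1} weight encoding are illustrative assumptions; the paper realizes these primitives with 'stochastic bit' hardware rather than software random number generators.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_neuron(membrane_potential, threshold=1.0):
    """Probabilistic spiking: firing probability rises with the membrane
    potential (a sigmoid is assumed; the hardware transfer curve may differ)."""
    p_spike = 1.0 / (1.0 + np.exp(-(membrane_potential - threshold)))
    return (rng.random(membrane_potential.shape) < p_spike).astype(np.int8)

def sstdp_update(weights, pre_spikes, post_spikes, p_pot=0.1, p_dep=0.05):
    """Stochastic STDP on 1-bit weights: a pre/post coincidence potentiates a
    synapse to 1 with probability p_pot; a lone pre-spike depresses it to 0
    with probability p_dep. Probabilities are illustrative placeholders."""
    coincident = np.outer(post_spikes, pre_spikes).astype(bool)
    pre_only = np.outer(1 - post_spikes, pre_spikes).astype(bool)
    pot_mask = coincident & (rng.random(weights.shape) < p_pot)
    dep_mask = pre_only & (rng.random(weights.shape) < p_dep)
    weights[pot_mask] = 1
    weights[dep_mask] = 0
    return weights

# One simulated timestep of a 784-input, 10-neuron fully connected layer.
n_in, n_out = 784, 10
weights = rng.integers(0, 2, size=(n_out, n_in)).astype(np.int8)  # 1-bit synapses
pre = rng.integers(0, 2, size=n_in).astype(np.int8)               # input spike vector
v_mem = (weights * pre).sum(axis=1) / n_in                        # membrane potentials
post = stochastic_neuron(v_mem, threshold=0.5)
weights = sstdp_update(weights, pre, post)

# Storing 1-bit instead of 32-bit weights yields the quoted 32x memory compression.
print("memory compression:", 32 // 1, "x")
```

The sketch only mirrors the probabilistic, one-bit nature of the computation; measured figures such as the 89.49 TOPS/Watt energy efficiency come from the fabricated 90nm CMOS stochastic-bit circuits, not from a software model like this one.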

Updated: 2020-02-27