Reconfigurable field-programmable gate array-based on-chip learning neuromorphic digital implementation for nonlinear function approximation
International Journal of Circuit Theory and Applications (IF 2.3) · Pub Date: 2021-06-21 · DOI: 10.1002/cta.3075
Morteza Gholami, Edris Zaman Farsa, Gholamreza Karimi

Hardware implementations of spiking neural networks, known as neuromorphic architectures, offer a concrete route to understanding brain performance. As a result, biological features of the brain may well inspire the next generation of computers and electronic systems in areas such as signal processing, image processing, function approximation, and pattern recognition. Approximating nonlinear functions has many uses in computer science and applied mathematics. The sigmoid is the most widely used activation function in neural networks and defines the relationship between biological and artificial neurons; it is well suited to outputs that represent probabilities between 0 and 1. In this paper, a spiking neural network using Izhikevich neurons and a gradient descent learning algorithm is proposed to approximate the sigmoid and other nonlinear functions. The flexibility of the spiking network is demonstrated by reporting the average relative errors of the approximation. A time- and cost-efficient digital neuromorphic implementation, based on an on-chip learning method, for approximating the sigmoid function is also discussed. The paper reports the results of hardware synthesis and the spiking network's physical implementation on a field-programmable gate array. The maximum frequency and throughput of the implemented network were 83.209 MHz and 9.86 Mb/s, respectively.
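The Izhikevich neuron mentioned in the abstract is a standard two-variable spiking model. The following is a minimal sketch of that model simulated with forward Euler; the step size, input current, and regular-spiking parameter set are illustrative assumptions, not values taken from the paper:

```python
# Minimal sketch of the Izhikevich neuron model, integrated with
# forward Euler. Parameters a, b, c, d below are the standard
# regular-spiking values; dt, T, and I are illustrative assumptions.
def simulate_izhikevich(I=10.0, T=1000.0, dt=0.5,
                        a=0.02, b=0.2, c=-65.0, d=8.0):
    v = -65.0            # membrane potential (mV)
    u = b * v            # membrane recovery variable
    spikes = []
    for step in range(int(T / dt)):
        # v' = 0.04 v^2 + 5 v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        # u' = a (b v - u)
        u += dt * a * (b * v - u)
        if v >= 30.0:    # spike: record time, then reset v and bump u
            spikes.append(step * dt)
            v = c
            u += d
    return spikes

spike_times = simulate_izhikevich()
```

With a constant suprathreshold input, the regular-spiking parameter set produces a tonic spike train, which is the qualitative behavior the network's neurons rely on.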

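To make the training objective concrete, here is a plain numeric sketch of gradient-descent function approximation together with the average-relative-error metric the abstract reports. This is not the paper's spiking network or on-chip learning rule; it fits a two-parameter model f(x) = sigmoid(w·x + b) to the standard sigmoid, with the grid, learning rate, and iteration count all chosen as assumptions:

```python
import math

# Illustrative sketch only -- NOT the paper's spiking implementation.
# Gradient descent fits f(x) = sigmoid(w*x + b) to the standard
# sigmoid on [-3, 3], then the average relative error is computed.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

xs = [-3.0 + 6.0 * i / 60 for i in range(61)]   # assumed sample grid
target = [sigmoid(x) for x in xs]

w, b = 0.5, 0.5          # deliberately wrong initial parameters
lr = 1.0                 # assumed learning rate
for _ in range(20000):
    gw = gb = 0.0
    for x, t in zip(xs, target):
        f = sigmoid(w * x + b)
        # gradient of (f - t)^2, using sigmoid'(z) = f * (1 - f)
        gw += 2.0 * (f - t) * f * (1.0 - f) * x
        gb += 2.0 * (f - t) * f * (1.0 - f)
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

# Average relative error, the flexibility metric the abstract cites.
avg_rel_err = sum(abs(sigmoid(w * x + b) - t) / t
                  for x, t in zip(xs, target)) / len(xs)
```

Because the target lies exactly in the model family, gradient descent drives (w, b) toward (1, 0) and the average relative error toward zero; the paper's contribution is realizing this kind of learning with spiking dynamics on an FPGA.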
Updated: 2021-08-12