
Improved integrate-and-fire neuron models for inference acceleration of spiking neural networks

Applied Intelligence

Abstract

We study the effects of different bio-synaptic membrane potential mechanisms on the inference speed of both spiking feed-forward neural networks and spiking convolutional neural networks. These mechanisms are inspired by biological neuron phenomena, including electronic conduction within neurons and the attenuation of chemical neurotransmitters between presynaptic and postsynaptic neurons. Within the framework of spiking neural networks, we model several biologically motivated membrane potential updating strategies based on integrate-and-fire (I&F) spiking neurons: the spiking neuron model with membrane potential decay (MemDec), the spiking neuron model with synaptic input current superposition at spiking time (SynSup), and the spiking neuron model with synaptic input current accumulation (SynAcc). Experimental results show that, compared with the general I&F model (one of the most commonly used spiking neuron models), SynSup and SynAcc effectively improve the inference speed of both spiking feed-forward and spiking convolutional neural networks.
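To make the four update rules concrete, the following is a minimal discrete-time sketch in Python/NumPy. The names I&F, MemDec, SynSup, and SynAcc follow the abstract, but the specific update equations shown here (the leak factor, the soft reset that carries over above-threshold charge at spiking time, and the accumulating synaptic current) are illustrative assumptions, not the authors' exact formulations.

import numpy as np

V_TH, V_RESET = 1.0, 0.0  # assumed firing threshold and reset potential

def if_step(v, i_in):
    """General I&F: integrate the input current, fire and hard-reset at threshold."""
    v = v + i_in
    spike = v >= V_TH
    return np.where(spike, V_RESET, v), spike

def memdec_step(v, i_in, decay=0.9):
    """MemDec (assumed form): the membrane potential leaks by a decay factor each step."""
    v = decay * v + i_in
    spike = v >= V_TH
    return np.where(spike, V_RESET, v), spike

def synsup_step(v, i_in):
    """SynSup (assumed form): input arriving at spiking time is superposed onto the
    reset potential (soft reset), so above-threshold charge is not discarded."""
    v = v + i_in
    spike = v >= V_TH
    return np.where(spike, v - V_TH + V_RESET, v), spike

def synacc_step(v, i_syn, i_in, acc=0.5):
    """SynAcc (assumed form): synaptic input current accumulates across steps
    before being integrated into the membrane potential."""
    i_syn = acc * i_syn + i_in
    v = v + i_syn
    spike = v >= V_TH
    return np.where(spike, V_RESET, v), spike, i_syn

# Toy usage: drive 5 neurons with random input currents for 20 time steps.
rng = np.random.default_rng(0)
v = np.zeros(5)
for _ in range(20):
    v, spikes = memdec_step(v, rng.uniform(0.0, 0.4, size=5))

The soft reset sketched for SynSup preserves residual charge at spiking time rather than discarding it, which is one common way such mechanisms reduce the number of simulation steps needed for firing rates to stabilize; whether this matches the paper's exact mechanism would need to be checked against the full text.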




Author information


Corresponding author

Correspondence to Anguo Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, Y., Zhang, A. Improved integrate-and-fire neuron models for inference acceleration of spiking neural networks. Appl Intell 51, 2393–2405 (2021). https://doi.org/10.1007/s10489-020-02017-3

