Efficient Design of Pruned Convolutional Neural Networks on FPGA
Journal of Signal Processing Systems ( IF 1.6 ) Pub Date : 2020-11-14 , DOI: 10.1007/s11265-020-01606-2
Mário Véstias

Convolutional Neural Networks (CNNs) have improved several computer vision applications, such as object detection and classification, compared to other machine learning algorithms. Running these models on edge computing devices close to the data sources is attracting the attention of the community, since it avoids high-latency communication of private data to the cloud and permits real-time decisions, turning these systems into smart embedded devices. However, running these models is computationally very demanding and requires a large amount of memory, both of which are scarce on edge devices compared to a cloud center. In this paper, we propose an architecture for the inference of pruned convolutional neural networks on FPGAs of any density. A configurable block pruning method is proposed, together with an architecture that supports the efficient execution of pruned networks. Pruning and batching are also studied together to determine how they influence each other. With the proposed architecture, we run the inference of a CNN with an average performance of 322 GOPs for 8-bit data on a XC7Z020 FPGA. Running AlexNet, the proposed architecture processes 240 images/s on a ZYNQ7020 and 775 images/s on a ZYNQ7045, with only 1.2% accuracy degradation.
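To illustrate the idea behind block pruning, the following is a minimal NumPy sketch: a weight matrix is tiled into fixed-size blocks, and the blocks with the smallest L1 norm are zeroed out until a target sparsity is reached. The block shape, norm criterion, and helper name are assumptions for illustration; the paper's configurable block pruning method may select and size blocks differently to match the FPGA architecture.

```python
import numpy as np

def block_prune(weights, block_shape=(4, 4), sparsity=0.5):
    """Zero out the fraction `sparsity` of weight blocks with the
    smallest L1 norm. Illustrative sketch only, not the paper's
    exact configurable block pruning method."""
    rows, cols = weights.shape
    br, bc = block_shape
    assert rows % br == 0 and cols % bc == 0, "blocks must tile the matrix"
    # View the matrix as a grid of (br x bc) blocks (a view, so edits
    # to `blocks` modify `weights` in place).
    blocks = weights.reshape(rows // br, br, cols // bc, bc)
    norms = np.abs(blocks).sum(axis=(1, 3))   # L1 norm of each block
    k = int(sparsity * norms.size)            # number of blocks to prune
    if k > 0:
        threshold = np.partition(norms.ravel(), k - 1)[k - 1]
        keep = norms > threshold              # keep blocks above the cutoff
        blocks *= keep[:, None, :, None]      # broadcast mask over each block
    return weights

rng = np.random.default_rng(0)
w = block_prune(rng.standard_normal((8, 8)), block_shape=(4, 4), sparsity=0.5)
```

Pruning whole blocks rather than individual weights keeps the remaining non-zeros in regular patterns, which is what lets a hardware architecture skip the pruned computation efficiently instead of handling irregular sparsity.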




Updated: 2020-11-15