Fast convolutional neural networks on FPGAs with hls4ml
Machine Learning: Science and Technology (IF 6.013), Pub Date: 2021-07-16, DOI: 10.1088/2632-2153/ac0ea1
Thea Aarrestad 1, Vladimir Loncar 1, Nicolò Ghielmetti 1, Maurizio Pierini 1, Sioni Summers 1, Jennifer Ngadiuba 2, Christoffer Petersson 3, Hampus Linander 3, Yutaro Iiyama 4, Giuseppe Di Guglielmo 5, Javier Duarte 6, Philip Harris 7, Dylan Rankin 7, Sergo Jindariani 8, Kevin Pedro 8, Nhan Tran 8, Mia Liu 9, Edward Kreinar 10, Zhenbin Wu 11, Duc Hoang 12

We introduce an automated tool for deploying ultra low-latency, low-power deep neural networks with convolutional layers on field-programmable gate arrays (FPGAs). By extending the hls4ml library, we demonstrate an inference latency of 5 µs using convolutional architectures, targeting microsecond latency applications like those at the CERN Large Hadron Collider. Considering benchmark models trained on the Street View House Numbers Dataset, we demonstrate various methods for model compression in order to fit the computational constraints of a typical FPGA device used in trigger and data acquisition systems of particle detectors. In particular, we discuss pruning and quantization-aware training, and demonstrate how resource utilization can be significantly reduced with little to no loss in model accuracy. We show that the FPGA critical resource consumption can be reduced by 97% with zero loss in model accuracy, and by 99% when tolerating a 6% accuracy degradation.
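The compression-and-deployment flow summarized above maps onto a short Python recipe. The sketch below is illustrative rather than the paper's exact setup: it assumes the QKeras quantization layers, the TensorFlow Model Optimization pruning wrappers, and the hls4ml conversion calls, while the toy architecture, 6-bit precision, 50% sparsity target, and FPGA part number are placeholder choices, not the configurations benchmarked in the paper.

# A minimal sketch of quantization-aware training, magnitude pruning,
# and hls4ml conversion. Bit widths, sparsity, architecture, and FPGA
# part are illustrative assumptions, not the paper's settings.
import tensorflow as tf
import tensorflow_model_optimization as tfmot
from qkeras import QConv2D, QDense, QActivation
from qkeras.quantizers import quantized_bits, quantized_relu
import hls4ml

# Quantization-aware training: weights and activations are constrained
# to low bit widths (here 6 bits) already during training.
def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(32, 32, 3)),
        QConv2D(16, (3, 3),
                kernel_quantizer=quantized_bits(6, 0, alpha=1),
                bias_quantizer=quantized_bits(6, 0, alpha=1)),
        QActivation(quantized_relu(6)),
        tf.keras.layers.MaxPooling2D((2, 2)),
        tf.keras.layers.Flatten(),
        QDense(10, kernel_quantizer=quantized_bits(6, 0, alpha=1),
               bias_quantizer=quantized_bits(6, 0, alpha=1)),
        tf.keras.layers.Activation('softmax'),
    ])

# Magnitude-based pruning: zero out a fixed fraction (here 50%) of the
# smallest weights during training, then strip the pruning wrappers.
pruning = {'pruning_schedule':
           tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0)}
model = tfmot.sparsity.keras.prune_low_magnitude(build_model(), **pruning)
model.compile(optimizer='adam', loss='categorical_crossentropy')
# model.fit(x_train, y_train,
#           callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])
model = tfmot.sparsity.keras.strip_pruning(model)

# Convert the compressed model to FPGA firmware with hls4ml; io_stream
# selects the streaming implementation used for convolutional layers.
config = hls4ml.utils.config_from_keras_model(model, granularity='name')
hls_model = hls4ml.converters.convert_from_keras_model(
    model, hls_config=config, io_type='io_stream',
    output_dir='cnn_hls', part='xcvu9p-flga2104-2-e')
hls_model.compile()            # bit-accurate C simulation
# hls_model.build(synth=True)  # run HLS synthesis for the target part

Because the quantizers are applied during training rather than after it, the network learns weights that are already representable at the reduced precision, which is what lets the resource savings come with little to no accuracy loss.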



Updated: 2021-07-16