multiPULPly
ACM Journal on Emerging Technologies in Computing Systems ( IF 2.2 ) Pub Date : 2021-04-15 , DOI: 10.1145/3432815
Adi Eliahu 1 , Ronny Ronen 1 , Pierre-Emmanuel Gaillardon 2 , Shahar Kvatinsky 1

Computationally intensive neural network applications often need to run on resource-limited, low-power devices. Numerous hardware accelerators have been developed to improve the performance of neural network applications and reduce their power consumption; however, most target data centers and full-fledged systems, and acceleration in ultra-low-power systems has been only partially addressed. In this article, we present multiPULPly, an accelerator that integrates memristive technologies within standard low-power CMOS technology to accelerate multiplication in neural network inference on ultra-low-power systems. The accelerator was designed for PULP, an open-source microcontroller system that uses low-power RISC-V processors. Memristors were integrated into the accelerator to consume power only when the memory is active, to resume tasks with no context-restoring overhead, and to enable highly parallel analog multiplication. To reduce the energy consumption, we propose novel dataflows, tailored to our architecture, that handle common multiplication scenarios. The accelerator was tested on an FPGA and achieved a peak energy efficiency of 19.5 TOPS/W, outperforming state-of-the-art accelerators by 1.5× to 4.5×.
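The principle behind memristive analog multiplication can be illustrated with a small numerical sketch (hypothetical values, not the paper's actual design): weights are programmed as crossbar conductances, input activations are applied as row voltages, and each column wire sums its cell currents, so the crossbar computes a matrix-vector product in a single analog step via Ohm's law and Kirchhoff's current law.

```python
import numpy as np

# Hypothetical 3x2 memristor crossbar: each weight is stored as a
# conductance value (rows = input lines, columns = output lines).
G = np.array([[1.0, 0.5],
              [0.2, 0.8],
              [0.4, 0.1]])

# Input activations applied as voltages on the row wires.
V = np.array([0.3, 0.6, 0.9])

# Each cell passes current I = V * G; each column wire sums its cells,
# so the column currents equal the dot products of V with each column.
I = V @ G

print(I)  # [0.78 0.72]
```

All columns are read in parallel, which is what makes the crossbar an attractive substrate for the highly parallel multiplications that dominate neural network inference.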
