Towards Scalable Analytics with Inference-enabled Solid-state Drives
IEEE Computer Architecture Letters ( IF 1.4 ) Pub Date : 2020-01-01 , DOI: 10.1109/lca.2019.2930590
Minsub Kim , Jaeha Kung , Sungjin Lee

In this paper, we propose a novel storage architecture, called an Inference-Enabled SSD (IESSD), which employs FPGA-based DNN inference accelerators inside an SSD. IESSD is capable of performing DNN operations inside an SSD, avoiding frequent data movement between application servers and data storage, which boosts the analytics performance of DNN applications. Moreover, by placing accelerators near the data within an SSD, IESSD delivers scalable analytics performance that improves with the amount of data to analyze. To evaluate its effectiveness, we implement an FPGA-based proof-of-concept prototype of IESSD and carry out a case study with an image tagging (classification) application. Our preliminary results show that IESSD achieves 1.81× better performance and 5.31× lower power consumption than a conventional system with GPU accelerators.
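The data-movement argument behind IESSD can be illustrated with a back-of-the-envelope model (not from the paper; all sizes are illustrative assumptions): a conventional pipeline must read every raw image across the storage interconnect to a host-side accelerator, whereas in-storage inference returns only compact tags.

```python
# Illustrative traffic model, assuming uncompressed 224x224 RGB inputs
# and an 8-byte tag (label id + confidence) per image. These constants
# are assumptions for the sketch, not figures from the paper.

IMAGE_BYTES = 224 * 224 * 3   # bytes per raw image read by the host
TAG_BYTES = 8                 # bytes returned per image by in-SSD inference

def host_side_traffic(num_images: int) -> int:
    """Conventional path: every raw image crosses the storage interconnect."""
    return num_images * IMAGE_BYTES

def in_storage_traffic(num_images: int) -> int:
    """IESSD-style path: DNN inference runs inside the SSD; only tags move."""
    return num_images * TAG_BYTES

n = 1_000_000
reduction = host_side_traffic(n) / in_storage_traffic(n)
print(f"interconnect traffic reduction: {reduction:.0f}x")
```

Because the per-image traffic is a constant under either scheme, the reduction is independent of dataset size, which is consistent with the claim that performance scales with the amount of data to analyze when accelerators sit next to the data.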
