Float-Fix: An Efficient and Hardware-Friendly Data Type for Deep Neural Network
International Journal of Parallel Programming (IF 1.5), Pub Date: 2019-05-29, DOI: 10.1007/s10766-018-00626-7
Dong Han, Shengyuan Zhou, Tian Zhi, Yibo Wang, Shaoli Liu

Abstract: In recent years, as deep learning has risen in prominence, neural network accelerators have boomed. Existing research shows that both speed and energy efficiency can be improved by low-precision data structures. However, decreasing the precision of data might compromise the usefulness and accuracy of the underlying AI, and existing studies cannot meet all AI application requirements. In this paper, we propose a new data type, called Float-Fix (FF). We introduce the structure of FF and compare it with other data types. In our evaluation, the accuracy loss of 8-bit FF is less than 0.12% on a subset of known neural network models, 7× better than fixed-point, DFX and floating-point on average. We implement the hardware architectures of operators and a neural processing unit using the 8-bit FF data type with the TSMC 65 nm Gplus High VT library. The experiments show that the hardware cost of the converters between 16-bit fixed-point and FF is very small, and the 8-bit FF multiplier needs only 1188 μm² of area, nearly the same as an 8-bit fixed-point multiplier. Compared with the neural processing unit of DianNao, FF reduces area by 34.3%.
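The abstract does not give Float-Fix's exact bit layout, so the following is only a hedged sketch of the general idea behind such low-precision hybrid formats: an 8-bit encoding with a sign bit, a few exponent bits, and a short mantissa, quantized from and reconstructed to a full-precision value. All field widths, the bias, and the function names here are illustrative assumptions, not the paper's actual FF definition.

```python
# Illustrative sketch only: a generic 1-sign / e_bits-exponent / m_bits-mantissa
# low-precision format, NOT the paper's actual Float-Fix layout.

def encode_lowprec(x, e_bits=3, m_bits=4, bias=3):
    """Quantize a float into (sign, exponent, fraction) fields."""
    sign = 0 if x >= 0 else 1
    mag = abs(x)
    if mag == 0:
        return (sign, 0, 0)
    # Normalize the magnitude into [1, 2), tracking the exponent.
    exp = 0
    while mag >= 2.0:
        mag /= 2.0
        exp += 1
    while mag < 1.0:
        mag *= 2.0
        exp -= 1
    # Clamp the biased exponent to the representable range.
    e = max(0, min((1 << e_bits) - 1, exp + bias))
    # Round the fractional part to m_bits of precision.
    frac = min(round((mag - 1.0) * (1 << m_bits)), (1 << m_bits) - 1)
    return (sign, e, frac)

def decode_lowprec(sign, e, frac, m_bits=4, bias=3):
    """Reconstruct the approximate real value from the fields."""
    mag = (1.0 + frac / (1 << m_bits)) * 2.0 ** (e - bias)
    return -mag if sign else mag
```

With these assumed widths, values round-trip with a small relative error, which is the property the paper's accuracy-loss evaluation measures for the real FF format.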

Updated: 2019-05-29