Deep Model Compression and Architecture Optimization for Embedded Systems: A Survey
Journal of Signal Processing Systems ( IF 1.6 ) Pub Date : 2020-10-12 , DOI: 10.1007/s11265-020-01596-1
Anthony Berthelier , Thierry Chateau , Stefan Duffner , Christophe Garcia , Christophe Blanc

Over the past decade, deep neural networks have proved to be an essential element for developing intelligent solutions. They have achieved remarkable performance at the cost of deeper architectures and millions of parameters. Therefore, utilising these networks on resource-limited platforms such as smart cameras is a challenging task. In this context, models need to be (i) accelerated and (ii) made memory-efficient without significantly compromising performance. Much work has been done to obtain smaller, faster and more accurate models. This paper presents a survey of methods suitable for porting deep neural networks to resource-limited devices, especially smart cameras. These methods can be roughly divided into two main parts. In the first part, we present compression techniques, categorized into knowledge distillation, pruning, quantization, hashing, reduction of numerical precision and binarization. In the second part, we focus on architecture optimization. We introduce methods to enhance network structures as well as neural architecture search techniques. In each part, we describe the different methods and analyse them. Finally, we conclude the paper with a discussion of these methods.
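To make two of the surveyed compression families concrete, the sketch below illustrates the basic ideas behind magnitude-based pruning (zeroing the smallest-magnitude weights) and uniform quantization (mapping weights onto a small set of evenly spaced levels). This is a minimal, framework-free illustration on a plain list of weights, not a description of any specific method from the survey; the function names and the toy weight vector are hypothetical.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest |w|."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_uniform(weights, bits=8):
    """Map float weights onto 2**bits evenly spaced levels, then decode back.

    Real quantization schemes keep the integer codes; decoding here only
    shows the rounding error introduced by the reduced precision.
    """
    lo, hi = min(weights), max(weights)
    levels = (1 << bits) - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [lo + round((w - lo) / scale) * scale for w in weights]

# Toy weight vector standing in for one layer's parameters.
w = [0.02, -0.5, 0.31, -0.01, 0.75, 0.04]
print(prune_by_magnitude(w, sparsity=0.5))  # [0.0, -0.5, 0.31, 0.0, 0.75, 0.0]
print(quantize_uniform(w, bits=8))
```

Pruning yields sparse tensors that compress well and can skip multiplications by zero; quantization shrinks storage (e.g. 32-bit floats to 8-bit codes) and enables faster integer arithmetic, which is why both appear throughout the embedded-deployment literature.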




Updated: 2020-10-12