Survey: Exploiting Data Redundancy for Optimization of Deep Learning
ACM Computing Surveys (IF 16.6), Pub Date: 2023-02-02, DOI: 10.1145/3564663
Jou-An Chen, Wei Niu, Bin Ren, Yanzhi Wang, Xipeng Shen

Data redundancy is ubiquitous in the inputs and intermediate results of Deep Neural Networks (DNNs). It offers many significant opportunities for improving DNN performance and efficiency and has been explored in a large body of work. These studies are scattered across many venues and span several years. The targets they focus on range from images to videos and texts, and the techniques they use to detect and exploit data redundancy also vary in many aspects. There has not yet been a systematic examination and summary of these efforts, making it difficult for researchers to get a comprehensive view of the prior work, the state of the art, the differences and shared principles, and the areas and directions yet to be explored. This article tries to fill that void. It surveys hundreds of recent papers on the topic, introduces a novel taxonomy that puts the various techniques into a single categorization framework, offers a comprehensive description of the main methods for exploiting data redundancy to improve multiple kinds of DNNs on different types of data, and points out a set of research opportunities for future exploration.
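As a concrete illustration of the kind of data redundancy the surveyed techniques target, the minimal Python sketch below skips DNN inference when a video frame is nearly identical to its predecessor and reuses the previous result. It is not a method from the survey; the model stub run_dnn, the diff_threshold value, and the synthetic frames are hypothetical, chosen only to show the detect-then-skip pattern.

import numpy as np

def run_dnn(frame: np.ndarray) -> np.ndarray:
    """Stand-in for an expensive DNN forward pass (hypothetical)."""
    return frame.mean(axis=(0, 1))  # placeholder "feature vector"

def redundancy_aware_inference(frames, diff_threshold=0.01):
    """Reuse the previous output when consecutive frames barely change."""
    prev_frame, prev_output = None, None
    for frame in frames:
        if prev_frame is not None:
            # Mean absolute pixel difference as a cheap redundancy detector.
            change = np.abs(frame - prev_frame).mean()
            if change < diff_threshold:
                yield prev_output          # skip the expensive forward pass
                prev_frame = frame
                continue
        prev_output = run_dnn(frame)       # run the DNN only on "novel" frames
        prev_frame = frame
        yield prev_output

# Example: a short synthetic video where most frames repeat.
video = [np.zeros((4, 4, 3))] * 3 + [np.ones((4, 4, 3))] * 2
outputs = list(redundancy_aware_inference(video))
print(len(outputs))  # 5 outputs, but run_dnn executed only twice

Real systems in the survey's scope use far more sophisticated redundancy detectors and reuse granularities (e.g., per-region or per-layer), but they share this basic structure of paying a cheap detection cost to avoid a much larger computation cost.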



Updated: 2023-02-02