Why Should We Add Early Exits to Neural Networks?
Cognitive Computation (IF 4.3) Pub Date: 2020-06-17, DOI: 10.1007/s12559-020-09734-4
Simone Scardapane, Michele Scarpiniti, Enzo Baccarelli, Aurelio Uncini

Deep neural networks are generally designed as a stack of differentiable layers, in which a prediction is obtained only after running the full stack. Recently, several contributions have proposed techniques to endow networks with early exits, allowing predictions to be obtained at intermediate points of the stack. These multi-output networks have a number of advantages, including (i) significant reductions in inference time, (ii) a reduced tendency toward overfitting and vanishing gradients, and (iii) the capability of being distributed over multi-tier computation platforms. In addition, they connect to the wider themes of biological plausibility and layered cognitive reasoning. In this paper, we provide a comprehensive introduction to this family of neural networks, describing in a unified fashion how these architectures can be designed, trained, and deployed in time-constrained scenarios. We also describe in depth their application scenarios in 5G and Fog computing environments, as well as some of the open research questions connected to them.
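To make the early-exit idea concrete, here is a minimal, self-contained sketch (not the authors' implementation) of a multi-output network: a stack of affine+ReLU blocks, each followed by an auxiliary softmax classifier, where inference stops at the first exit whose top class probability clears a confidence threshold. All names (`EarlyExitNet`, `predict`, the random untrained weights) are illustrative assumptions introduced here.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class EarlyExitNet:
    """Toy multi-output (early-exit) network: a stack of random affine+ReLU
    blocks, each followed by an auxiliary softmax classifier (an "exit").
    Weights are random and untrained; this only illustrates the control flow."""

    def __init__(self, in_dim=8, hidden=16, n_classes=3, n_blocks=4, seed=0):
        rng = np.random.default_rng(seed)
        dims = [in_dim] + [hidden] * n_blocks
        self.blocks = [rng.standard_normal((dims[i], dims[i + 1])) * 0.5
                       for i in range(n_blocks)]
        self.exits = [rng.standard_normal((hidden, n_classes)) * 0.5
                      for _ in range(n_blocks)]

    def predict(self, x, threshold=0.8):
        """Run the stack, stopping at the first exit whose top softmax
        probability reaches `threshold`. Returns (exit_index, probabilities).
        The final exit always fires, so a prediction is guaranteed."""
        h = x
        for i, (w, w_exit) in enumerate(zip(self.blocks, self.exits)):
            h = np.maximum(h @ w, 0.0)          # affine + ReLU block
            p = softmax(h @ w_exit)             # auxiliary classifier at exit i
            if p.max() >= threshold or i == len(self.blocks) - 1:
                return i, p                     # confident enough: exit early

net = EarlyExitNet()
exit_idx, probs = net.predict(np.ones(8), threshold=0.8)
```

The threshold trades accuracy for latency: a low threshold exits at shallow layers (fast, less reliable), while an unreachable threshold degenerates to running the full stack, as in a conventional single-output network.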
