Performance evaluation of edge-computing platforms for the prediction of low temperatures in agriculture using deep learning
The Journal of Supercomputing (IF 2.5), Pub Date: 2020-04-29, DOI: 10.1007/s11227-020-03288-w
Miguel A. Guillén, Antonio Llanes, Baldomero Imbernón, Raquel Martínez-España, Andrés Bueno-Crespo, Juan-Carlos Cano, José M. Cecilia

The Internet of Things (IoT) is driving the digital revolution. Almost all economic sectors are becoming "Smart" thanks to the analysis of data generated by IoT. This analysis is carried out by advanced artificial intelligence (AI) techniques that provide insights never before imagined. The combination of IoT and AI is giving rise to an emerging trend, called AIoT, which is opening up new paths to bring digitization into the new era. However, there is still a big gap between AI and IoT, which lies basically in the computational power required by the former and the scarce computational resources offered by the latter. This is particularly true in rural IoT environments, where the lack of connectivity (or low-bandwidth connections) and of power supply forces the search for "efficient" alternatives to provide computational resources to IoT infrastructures without increasing power consumption. In this paper, we explore edge computing as a solution for bridging the gap between AI and IoT in rural environments. We evaluate the training and inference stages of a deep-learning-based precision agriculture application for frost prediction on a modern Nvidia Jetson AGX Xavier in terms of performance and power consumption. Our experimental results reveal that cloud approaches are still a long way ahead in terms of performance, but the inclusion of GPUs in edge devices offers new opportunities for those scenarios where connectivity is still a challenge.
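
The abstract does not reproduce the authors' code, so the following is only a minimal sketch of the kind of benchmark it describes: timing the training and inference stages of a small deep-learning model for short-term minimum-temperature (frost) prediction. The framework (PyTorch), network architecture, input window size, and synthetic data are all illustrative assumptions, not the paper's actual setup; on a Jetson AGX Xavier, power consumption would be sampled separately (e.g. with the board's tegrastats utility).

# Sketch of a training/inference timing benchmark for frost prediction
# (illustrative only; architecture, window size and data are assumptions).
import time
import torch
import torch.nn as nn

WINDOW = 24          # assumed input: 24 hourly sensor readings
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Small fully connected regressor; the paper's actual network may differ.
model = nn.Sequential(
    nn.Linear(WINDOW, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 1),
).to(device)

# Synthetic stand-in for historical temperature windows and targets.
x = torch.randn(4096, WINDOW, device=device)
y = x.mean(dim=1, keepdim=True) - 2.0   # dummy "minimum temperature" target

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# --- training stage ---
t0 = time.perf_counter()
for _ in range(100):                    # fixed epoch count, timing only
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
if device.type == "cuda":
    torch.cuda.synchronize()            # wait for GPU work before reading the clock
print(f"training: {time.perf_counter() - t0:.3f} s")

# --- inference stage ---
model.eval()
sample = torch.randn(1, WINDOW, device=device)
with torch.no_grad():
    t0 = time.perf_counter()
    for _ in range(1000):
        model(sample)
    if device.type == "cuda":
        torch.cuda.synchronize()
print(f"inference: {(time.perf_counter() - t0):.3f} ms/sample")

For comparable performance and power numbers across runs on the Xavier, one would presumably also pin the board to a fixed power mode (e.g. via nvpmodel) before timing.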
