Red-green-blue to normalized difference vegetation index translation: a robust and inexpensive approach for vegetation monitoring using machine vision and generative adversarial networks
Precision Agriculture (IF 6.2) Pub Date: 2023-03-07, DOI: 10.1007/s11119-023-10001-3
Aitazaz A. Farooque, Hassan Afzaal, Rachid Benlamri, Salem Al-Naemi, Evan MacDonald, Farhat Abbas, Kaelyn MacLeod, Hassan Ali

High-resolution multispectral imaging of agricultural fields is expensive but helpful in detecting subtle variations in plant health and stress symptoms before visible indications appear. To aid precision agriculture (PA) practices, an innovative and inexpensive protocol for robust and timely monitoring of vegetation symptoms was evaluated. The protocol uses machine vision (MV) and generative adversarial networks (GANs) to translate red-green-blue (RGB) imagery captured with an unmanned aerial vehicle (UAV) into a normalized difference vegetation index (NDVI) map. This study translated RGB imagery directly into the NDVI, in contrast with similar studies that used GANs to translate into the near-infrared (NIR) band. The protocol was tested by flying a fixed-wing eBee X UAV developed by senseFly Inc. (Cheseaux-sur-Lausanne, Switzerland), equipped with a RedEdge-MX sensor, to capture images from five potato fields located in Prince Edward Island, Canada, during the 2021 growing season. Images were captured throughout the season during the vegetative (15–30 DAP; days after planting), tuber formation (30–45 DAP), tuber bulking (75–110 DAP), and tuber maturation (> 110 DAP) stages. The NDVI was calculated from the UAV aerial surveys using the NIR and red bands to build pairwise datasets for GAN training. Five hundred pairwise images were used (80% training, 10% validation, and 10% testing) for training and evaluation of the GANs. Two well-known GANs, Pix2Pix and Pix2PixHD, were compared using various training and evaluation indicators. Pix2PixHD outperformed Pix2Pix, recording lower root mean square errors (RMSE; 5.40 to 13.73) and higher structural similarity index measure (SSIM) scores (0.69 to 0.90) during the evaluation of the protocol. After model training, the results of this study can be used for economical vegetation and orchard health monitoring. The trained GANs can translate simple RGB imagery into useful vegetation index maps for variable-rate PA practices. The protocol can also translate remote sensing imagery of large-scale agricultural fields and commercial orchards into NDVI maps to extract useful information about plant health indicators.
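
As a quick reference for the translation target and the evaluation metrics mentioned in the abstract, the sketch below shows how a reference NDVI map is computed from NIR and red bands and how a generated map can be scored with RMSE and SSIM. It is a minimal illustration only: the array names, the synthetic data, and the use of NumPy and scikit-image are assumptions for this sketch, not details taken from the paper.

```python
# Minimal sketch (assumption: NumPy + scikit-image available; the random bands
# below are illustrative stand-ins for real RedEdge-MX red and NIR imagery).
import numpy as np
from skimage.metrics import structural_similarity as ssim


def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-10)  # small epsilon avoids divide-by-zero


def evaluate(pred: np.ndarray, target: np.ndarray) -> tuple[float, float]:
    """RMSE and SSIM between a generated NDVI map and the reference NDVI map."""
    rmse = float(np.sqrt(np.mean((pred - target) ** 2)))
    score = float(ssim(target, pred, data_range=float(target.max() - target.min())))
    return rmse, score


# Toy usage with random bands standing in for real UAV survey data.
rng = np.random.default_rng(0)
red_band = rng.uniform(0.02, 0.3, (256, 256))
nir_band = rng.uniform(0.3, 0.6, (256, 256))
target_ndvi = ndvi(nir_band, red_band)
predicted_ndvi = target_ndvi + rng.normal(0.0, 0.02, target_ndvi.shape)  # fake GAN output
print(evaluate(predicted_ndvi, target_ndvi))
```

In the study itself, the pairwise RGB/NDVI images were split 80/10/10 into training, validation, and test sets before training Pix2Pix and Pix2PixHD, and the trained generators were then scored against the reference NDVI maps with RMSE and SSIM as above.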



Last updated: 2023-03-08