
Automation in Agriculture by Machine and Deep Learning Techniques: A Review of Recent Developments

  • Review
  • Published in: Precision Agriculture

A Correction to this article was published on 21 June 2021


Abstract

Agriculture has recently attracted considerable attention as a target for automation through artificial intelligence techniques and robotic systems, and advances in machine learning (ML) have driven marked improvements across agricultural tasks. Automatic feature extraction gives deep learning (DL), and convolutional neural networks in particular, the adaptability to reach human-level accuracy in applications such as plant disease detection and classification, weed/crop discrimination, fruit counting, land cover classification, and crop/plant recognition. This review surveys the performance of ML and DL algorithms and architectures applied to agricultural robotics over the last decade. Performance plots compare the effectiveness of deep learning against traditional machine learning models for selected agricultural operations. The analysis of prominent studies shows that DL-based models such as the Region-based Convolutional Neural Network (RCNN) achieve a higher plant disease/pest detection rate (82.51%) than well-known ML algorithms, including the Multi-Layer Perceptron (64.9%) and K-Nearest Neighbour (63.76%). For crop/weed discrimination, the ResNet-18 architecture attained a higher Area Under the Curve (94.84%) than ML-based techniques such as Random Forest (RF) (70.16%) and the Support Vector Machine (SVM) (60.6%). For the classification of agricultural land covers, Fully Convolutional Networks (FCN) recorded higher accuracy (83.9%) than SVM (67.6%) and RF (65.6%). Finally, important research gaps in previous studies and promising future directions are identified to help advance automation in agriculture to the next level.
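To connect the figures quoted above to a concrete workflow, the sketch below shows one plausible way to fine-tune ResNet-18 for binary crop/weed discrimination and report an Area Under the Curve on a held-out split, alongside which classical SVM or RF baselines are typically compared. This is a minimal illustration only: the PyTorch/torchvision tooling, dataset path, folder layout, and hyperparameters are assumptions for demonstration, not details taken from the reviewed studies.

```python
# Minimal sketch (assumed setup): ResNet-18 transfer learning for crop vs. weed
# classification, evaluated by Area Under the ROC Curve on a validation split.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models
from sklearn.metrics import roc_auc_score

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical folder layout: data/{train,val}/{crop,weed}/*.jpg
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)
val_ds = datasets.ImageFolder("data/val", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=32)

# ImageNet-pretrained ResNet-18 with the final layer replaced for 2 classes
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):  # a few epochs, purely for illustration
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

# Area Under the ROC Curve on the held-out split, the metric quoted
# for crop/weed discrimination in the abstract.
model.eval()
scores, labels = [], []
with torch.no_grad():
    for x, y in val_dl:
        probs = torch.softmax(model(x.to(device)), dim=1)[:, 1]
        scores.extend(probs.cpu().tolist())
        labels.extend(y.tolist())
print("Validation AUC:", roc_auc_score(labels, scores))
```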






Funding

This research was funded by the Ministry of Business, Innovation and Employment (MBIE), New Zealand, Science for Technological Innovation (SfTI) National Science Challenge.

Author information


Corresponding author

Correspondence to Khalid Mahmood Arif.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original version of this article was revised: Figures 6, 7, 8 and 9 have been corrected.


About this article


Cite this article

Saleem, M.H., Potgieter, J. & Arif, K.M. Automation in Agriculture by Machine and Deep Learning Techniques: A Review of Recent Developments. Precision Agric 22, 2053–2091 (2021). https://doi.org/10.1007/s11119-021-09806-x



  • DOI: https://doi.org/10.1007/s11119-021-09806-x
