A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture

Precision Agriculture

Abstract

This paper addresses the problem of autonomous navigation in agricultural fields. It proposes a localization and mapping framework based on semantic place classification and key-location estimation, which together build a hybrid topological map. This map benefits from a generic partitioning of the field into a finite set of well-differentiated workspaces; through semantic analysis, the position (state) of a mobile system in the field can be estimated in a probabilistic way. Moreover, the map integrates both metric features (key locations) and semantic features (working areas), so one of its advantages is that a full and precise map prior to navigation is not necessary. The identification of the key locations and working areas is carried out by a perception system based on a 2D LIDAR and RGB cameras; fusing these data with odometry allows the robot to be located in the topological map. The approach is assessed on off-line data recorded in real conditions in diverse fields during different seasons. It exploits a real-time object detector based on a convolutional neural network, You Only Look Once version 3 (YOLOv3), which has been trained to classify a considerable number of crops, including market-garden crops such as broccoli and cabbage, and to identify grapevine trunks. The results show the interest of the approach, which allows (i) obtaining a simple and easy-to-update map, (ii) avoiding the use of artificial landmarks, and thus (iii) improving the autonomy of agricultural robots.
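The probabilistic estimation of the robot's state over a finite set of workspaces, as described above, can be sketched as a discrete Bayes filter over a topological partition of the field. The workspace names, transition probabilities and observation classes below are illustrative assumptions for a minimal sketch, not the paper's actual model or parameters:

```python
# Minimal sketch (assumed values): a discrete Bayes filter over a
# topological partition of the field, fusing an odometry-driven
# transition model with semantic observations from the perception system.

STATES = ["headland", "crop_row", "row_end"]  # assumed workspace set

# Assumed P(next state | current state), loosely encoding field geometry:
# from a headland the robot may enter a row, rows end at row ends, etc.
TRANS = {
    "headland": {"headland": 0.7, "crop_row": 0.3, "row_end": 0.0},
    "crop_row": {"headland": 0.0, "crop_row": 0.8, "row_end": 0.2},
    "row_end":  {"headland": 0.6, "crop_row": 0.4, "row_end": 0.0},
}

# Assumed P(observation class | state), as might come from a semantic
# classifier reporting what the cameras see around the robot.
OBS = {
    "open_ground": {"headland": 0.7, "crop_row": 0.1, "row_end": 0.3},
    "crop_plants": {"headland": 0.2, "crop_row": 0.8, "row_end": 0.4},
}

def bayes_step(belief, z):
    """One predict-update cycle of the place-level Bayes filter."""
    # Predict: propagate the belief through the transition model.
    predicted = {
        s: sum(belief[p] * TRANS[p][s] for p in STATES) for s in STATES
    }
    # Update: weight each state by the likelihood of the observation.
    updated = {s: OBS[z][s] * predicted[s] for s in STATES}
    total = sum(updated.values())
    return {s: v / total for s, v in updated.items()}

# Start at a headland, then observe crop plants on two successive steps.
belief = {"headland": 1.0, "crop_row": 0.0, "row_end": 0.0}
for z in ["open_ground", "crop_plants", "crop_plants"]:
    belief = bayes_step(belief, z)

print(max(belief, key=belief.get))  # most likely workspace
```

In the paper's framework the observation likelihoods would come from the LIDAR/camera pipeline (e.g., crop or trunk detections) and the transitions from odometry, but the recursive predict-update structure is the same.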



Acknowledgements

The authors would like to thank colleagues from Naïo Technologies, Toulouse, for their participation in this work, through the collaborative project DESHERB’EUR funded by the program «Investment for the Future» of the French government. The authors would also like to thank the French Institute of Vine and Wine (Institut Français de la Vigne et du Vin), for allowing the use of their experimental fields for experimental tests.


Corresponding authors

Correspondence to L. Emmi or V. Cadenat.


Cite this article

Emmi, L., Le Flécher, E., Cadenat, V. et al. A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture. Precision Agric 22, 524–549 (2021). https://doi.org/10.1007/s11119-020-09773-9


Keywords

Navigation