
Improved localization in a corn crop row using a rotated laser rangefinder for three-dimensional data acquisition

  • Technical Paper
  • Published:
Journal of the Brazilian Society of Mechanical Sciences and Engineering

Abstract

Small robotic vehicles have been navigating agricultural fields in pursuit of new ways to increase agricultural production and to meet growing food and energy demands. However, a perception system with reliable awareness of the surroundings remains a challenge for autonomous navigation. Cameras and single-layer laser scanners have been the primary sources of information, yet the former suffers from sensitivity to outdoor light and both suffer from occlusion by leaves. This paper describes a three-dimensional acquisition system for corn crops. The sensing core is a single-layer UTM-30LX laser scanner rotating around its axis, while an inertial sensor provides angular measurements. With the rotation, multiple layers compose a 3D point cloud, which is represented by a two-dimensional occupancy grid. Each cell is filled according to the number of readings, and their weights derive from two procedures: first, a mask enhances vertical entities (stalks); second, two Gaussian functions centered on the expected positions of the immediate neighboring rows weaken readings in the middle of the lane and from farther rows. The resulting occupancy grid represents the corn rows as virtual walls, which serve as references for a wall-follower algorithm. According to experimental results, the virtual walls are segmented with less influence from straying leaves and sparse weeds than the segmentation obtained from single-layer laser scanner data. Indeed, 64.02% of the 3D outputs are within a 0.05 m error limit of the expected lane width, whereas only 11.63% of the single-layer laser data are within the same limit.
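The Gaussian row-weighting described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the lane width, Gaussian spread `sigma`, grid extent, and cell size are assumed values for demonstration, and the vertical-entity (stalk) mask is not reproduced here.

```python
import numpy as np

def row_weight(y, lane_width=0.9, sigma=0.1):
    """Weight a reading by its lateral offset y (m) from the lane center.
    Two Gaussians centered on the expected immediate neighboring rows
    (at +/- lane_width/2) emphasize readings near those rows and weaken
    readings in the middle of the lane and from farther rows."""
    left = np.exp(-((y + lane_width / 2) ** 2) / (2 * sigma ** 2))
    right = np.exp(-((y - lane_width / 2) ** 2) / (2 * sigma ** 2))
    return left + right

def fill_grid(points, cell=0.05, x_max=4.0, y_max=1.5,
              lane_width=0.9, sigma=0.1):
    """Accumulate weighted readings into a 2D occupancy grid.
    points: (N, 2) array of (x, y) ground-plane coordinates in meters."""
    nx, ny = int(2 * x_max / cell), int(2 * y_max / cell)
    grid = np.zeros((nx, ny))
    for x, y in points:
        i = int((x + x_max) / cell)   # forward axis index
        j = int((y + y_max) / cell)   # lateral axis index
        if 0 <= i < nx and 0 <= j < ny:
            grid[i, j] += row_weight(y, lane_width, sigma)
    return grid
```

With these assumed parameters, a reading at the expected row position (y = 0.45 m) receives a weight near 1, while a reading in the middle of the lane (y = 0) receives a weight near 0, so lane-center clutter such as straying leaves contributes little to the virtual walls.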


Figures 1–11 appear in the full article (captions not available in this preview).



Acknowledgements

The authors thank the São Paulo Research Foundation (FAPESP), grants #2017/10401-3 and #2016/09970-0, EMBRAPA, and EESC-USP for their support of this work. This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.

Author information

Corresponding author

Correspondence to Mateus V. Gasparino.

Additional information

Technical Editor: Victor Juliano De Negri.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Gasparino, M.V., Higuti, V.A.H., Velasquez, A.E.B. et al. Improved localization in a corn crop row using a rotated laser rangefinder for three-dimensional data acquisition. J Braz. Soc. Mech. Sci. Eng. 42, 592 (2020). https://doi.org/10.1007/s40430-020-02673-z


  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s40430-020-02673-z

Keywords

Navigation