Efficient tomato harvesting robot based on image processing and deep learning

Abstract

Agricultural robots are rapidly becoming more advanced with the development of relevant technologies, and they are in great demand to help secure the food supply. As such, they are slated to play an important role in precision agriculture. In tomato production, harvesting employs over 40% of the total workforce, so it is worthwhile to develop a robot harvester to assist workers. The objective of this work was to understand the factors limiting recognition accuracy when using image processing and deep learning methods, and to improve the performance of crop detection in complex agricultural environments. With accurate recognition of the growing status and location of crops, temporal crop management and selective harvesting become possible, and issues caused by the growing shortage of agricultural labour can be alleviated. To this end, this work integrates classic image processing methods with the YOLOv5 (You Only Look Once version 5) network to increase the accuracy and robustness of tomato and stem perception. An algorithm to estimate the maturity of truss tomatoes (clusters of individual tomatoes) and an integrated method to locate stems, based on the experimentally determined error of each individual method, were proposed. Both indoor and real-field tests were carried out using a robot harvester. The results demonstrated the high accuracy of the proposed algorithms under varied illumination conditions, with an average deviation of 2 mm from the ground truth. The robot can be guided to harvest truss tomatoes efficiently, with an average operating time of 9 s per cluster.
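
The maturity-estimation idea can be illustrated with a minimal sketch: given tomato bounding boxes from a detector such as YOLOv5, score each fruit by the fraction of ripe-red pixels in HSV space and report the ripe fraction for the whole cluster. This is only an assumption-based illustration of the general approach outlined in the abstract, not the authors' algorithm; the function name cluster_maturity, the HSV thresholds and the 0.5 ripeness cut-off are hypothetical placeholders.

```python
# Hedged illustration only -- not the algorithm from the paper.
# Assumes tomato boxes come from an external detector (e.g. YOLOv5) and that
# "ripe" can be approximated by the share of red pixels in HSV space.
import cv2


def cluster_maturity(image_bgr, boxes, red_ratio_thresh=0.5):
    """Return the fraction of detected tomatoes in one truss judged ripe.

    image_bgr        : HxWx3 uint8 BGR image of a single cluster.
    boxes            : list of (x1, y1, x2, y2) tomato boxes from the detector.
    red_ratio_thresh : hypothetical cut-off on the per-fruit red-pixel ratio.
    """
    if not boxes:
        return 0.0
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    ripe = 0
    for box in boxes:
        x1, y1, x2, y2 = map(int, box)
        roi = hsv[y1:y2, x1:x2]
        if roi.size == 0:
            continue
        # Red wraps around the hue axis, so combine two hue bands.
        low = cv2.inRange(roi, (0, 80, 60), (10, 255, 255))
        high = cv2.inRange(roi, (170, 80, 60), (180, 255, 255))
        red_pixels = cv2.countNonZero(cv2.bitwise_or(low, high))
        if red_pixels / float(roi.shape[0] * roi.shape[1]) >= red_ratio_thresh:
            ripe += 1
    return ripe / len(boxes)
```

In a pipeline, a detector wrapper would supply the boxes for each cluster; the error-based stem localisation mentioned above could be sketched analogously as a weighted combination of candidate stem positions, with weights derived from each method's measured error.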

References

  • Berenstein, R., & Edan, Y. (2017). Human-robot collaborative site-specifific sprayer. Journal of Field Robotics, 34(8), 1519–1530.

    Article  Google Scholar 

  • Chen, W., Lu, S., Liu, B., Li, G., & Qian, T. (2020). Detecting citrus in orchard environment by using improved yolov4. Scientific Programming, 2020, 1–13.

    CAS  Google Scholar 

  • Chen, X., Chaudhary, K., Tanaka, Y., Nagahama, K., Yaguchi, H., Okada, K., & Inaba, M. (2015) Reasoning-based vision recognition for agricultural humanoid robot toward tomato harvesting. In: 2015 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp 6487–6494).

  • Christiaensen, L., Rutledge, Z., & Taylor, J. E. (2020). Viewpoint: The future of work in agri-food. Food Policy, 99, 101963.

    Article  Google Scholar 

  • Feng Q, Wang X, Wang G, Li Z (2015) Design and test of tomatoes harvesting robot. In: 2015 IEEE international conference on information and automation (pp 949–952).

  • Gravalos, I., Avgousti, A., Gialamas, T., Alfieris, N., & Paschalidis, G. (2019). A robotic irrigation system for urban gardening and agriculture. Journal of Agricultural Engineering, 50(4), 198–207.

    Article  Google Scholar 

  • Hess, W., Kohler, D., Rapp, H., & Andor, D. (2016). Real-time loop closure in 2D LIDAR SLAM. In: 2016 IEEE international conference on robotics and automation (ICRA) (pp. 1271–1278).

  • Jia, W., Mou, S., Wang, J., Liu, X., Zheng, Y., Lian, J., & Zhao, D. (2020). Fruit recognition based on pulse coupled neural network and genetic Elman algorithm application in apple harvesting robot. International Journal of Advanced Robotic Systems, 17(1), 1729881419897473. https://doi.org/10.1177/1729881419897473

    Article  Google Scholar 

  • Jocher, G., Stoken, A., Borovec, J., NanoCode012, ChristopherSTAN, Changyu, L., Laughing, Hogan, A., Tkianai, L., YxNONG, AlexWang1900, Diaconu, L., Marc, Wanghaoyang0106, Ml5ah, Doug, Hatovix, Poznanski, J., Lijun, Y., Rai, P., Ferriday, R., Sullivan, T., Xinyu, W., YuriRibeiro, Eduard Reñé Claramunt, E., R., Hopesala, Dave, P., & Chen, Y. (2020). Ultralytics/yolov5: V3.0.  https://doi.org/10.5281/zenodo.3983579

  • Kanagasingham, S., Ekpanyapong, M., & Chaihan, R. (2020). Integrating machine vision-based row guidance with gps and compass-based routing to achieve autonomous navigation for a rice field weeding robot. Precision Agriculture, 21(4), 1–25.

    Article  Google Scholar 

  • Lehnert, C., English, A., McCool, C., Tow, A. W., & Perez, T. (2017). Autonomous sweet pepper harvesting for protected cropping systems. IEEE Robotics and Automation Letters, 2(2), 872–879.

    Article  Google Scholar 

  • Lili, W., Bo, Z., Jinwei, F., Xiaoan, H., Shu, W., Yashuo, L., Zhou, Q., & Chongfeng, W. (2017). Development of a tomato harvesting robot used in greenhouse. International Journal of Agricultural and Biological Engineering, 10(4), 140–149.

    Article  Google Scholar 

  • Lin, T.-Y., Maire, M., Belongie, S. J., Hays, J., Perona, P., Ramanan, D., Dollár, P., & Zitnick, C., L. (2014). Microsoft COCO: Common objects in context. In: European conference on computer vision (pp. 740–755).

  • Liu, J., Pi, J., & Xia, L. (2020). A novel and high precision tomato maturity recognition algorithm based on multi-level deep residual network. Multimedia Tools and Applications, 79, 9403–9417.

    Article  Google Scholar 

  • Lv, J., Wang, Y., Ni, H., Wang, Q., Rong, H., Ma, Z., Yang, B., & Xu, L. (2019b). Method for discriminating of the shape of overlapped apple fruit images. Biosystems Engineering, 186, 118–129.

    Article  Google Scholar 

  • Lv, J., Wang, Y., Xu, L., Gu, Y., Zou, L., Yang, B., & Ma, Z. (2019a). A method to obtain the near-large fruit from apple image in orchard for single-arm apple harvesting robot. Scientia Horticulturae, 257, 108758.

    Article  Google Scholar 

  • Pršić, D., Nedić, N., & Stojanović, V. (2017). A nature inspired optimal control of pneumatic- driven parallel robot platform. Proceedings of the Institution of Mechanical Engineers, Part c: Journal of Mechanical Engineering Science, 231(1), 59.

    Google Scholar 

  • Raja, R., Nguyen, T. T., Slaughter, D. C., & Fennimore, S. A. (2020). Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosystems Engineering, 192, 257–274.

    Article  Google Scholar 

  • van Henten, E. J., Hemming, J., van Tuijl, B. A. J., Kornet, J. G., Meuleman, J., Bontsema, J., & van Os, E. A. (2002). An autonomous robot for harvesting cucumbers in greenhouses. Autonomous Robots, 13(3), 241–258.

    Article  Google Scholar 

  • Wang, C., Tang, Y., Zou, X., SiTu, W., & Feng, W. (2017). A robust fruit image segmentation algorithm against varying illumination for vision system of fruit harvesting robot. Optik, 131, 626–631.

    Article  Google Scholar 

  • Williams, H. A., Jones, M. H., Nejati, M., Seabright, M. J., Bell, J., Penhall, N. D., Barnett, J. J., Duke, M. D., Scarfe, A. J., Ahn, H. S., Lim, J., & MacDonald, B. A. (2019). Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and robotic arms. Biosystems Engineering, 181, 140–156.

    Article  Google Scholar 

  • Xiao, B., Cao, L., Xu, S., & Liu, L. (2020). Robust tracking control of robot manipulators with actuator faults and joint velocity measurement uncertainty. IEEE-ASME Transactions on Mechatronics, 25(3), 1354–1365.

    Article  Google Scholar 

  • Xiong, Y., Ge, Y., & From, P. J. (2020). An obstacle separation method for robotic picking of fruits in clusters. Computers and Electronics in Agriculture, 175, 105397. https://doi.org/10.1016/j.compag.2020.105397

    Article  Google Scholar 

  • Xu, R., Lin, H., Lu, K., Cao, L., & Liu, Y. (2021). A forest fire detection system based on ensemble learning. Forests, 12(2), 217.

    Article  Google Scholar 

  • Xuan, G., Gao, C., Shao, Y., Zhang, M., Wang, Y., Zhong, J., Li, Q., & Peng, H. (2020). Apple detection in natural environment using deep learning algorithms. IEEE Access, 8, 216772–216780. https://doi.org/10.1109/ACCESS.2020.3040423

    Article  Google Scholar 

  • Yaguchi, H., Nagahama, K., Hasegawa, T., & Inaba, M. (2016). Development of an autonomous tomato harvesting robot with rotational plucking gripper. In: 2016 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp 652–657).

  • Yan, B., Fan, P., Lei, X., Liu, Z., & Yang, F. (2021). A real-time apple targets detection method for picking robot based on improved YOLOv5. Remote Sensing, 13(9), 1619.

    Article  Google Scholar 

  • Yoshida, T., Fukao, T., & Hasegawa, T. (2018). Fast detection of tomato peduncle using point cloud with a harvesting robot. Journal of Robotics and Mechatronics, 30(2), 180–186.

    Article  Google Scholar 

  • Yoshida, T., Fukao, T., & Hasegawa, T. (2019) A tomato recognition method for harvesting with robots using point clouds. In: 2019 IEEE/SICE International Symposium on System Integration (SII) (pp 456–461). https://doi.org/10.1109/SII.2019.8700358

  • Yu, Y., Zhang, K., Liu, H., Yang, L., & Zhang, D. (2020). Real-time visual localization of the picking points for a ridge-planting strawberry harvesting robot. IEEE Access, 8, 116556–116568. https://doi.org/10.1109/ACCESS.2020.3003034

    Article  Google Scholar 

  • Zhang, Z., & Heinemann, P. H. (2017). Economic analysis of a low-cost apple harvest-assist unit. HortTechnology, 27(2), 240–247.

    Article  Google Scholar 

  • Zhuang, J., Luo, S., Hou, C., Tang, Y., He, Y., & Xue, X. Y. (2018). Detection of orchard citrus fruits using a monocular machine vision-based method for automatic fruit picking applications. Computers and Electronics in Agriculture, 152, 64–73.

    Article  Google Scholar 

Download references

Acknowledgements

The authors would like to thank all the members of the Intelligent Equipment and Robotics Lab of Shanghai University for their kind support. This project was funded by the Shanghai Agricultural and Rural Committee (Grant No. 202002080010F01467).

Author information

Corresponding author

Correspondence to Teng Sun.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below are the links to the electronic supplementary material.

Supplementary file1 (MP4 4061 kb)

Supplementary file2 (MP4 37034 kb)

Supplementary file3 (MP4 23491 kb)

Supplementary file4 (MP4 203300 kb)

Supplementary file5 (TXT 1 kb)

Appendix

Figures a–c

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Miao, Z., Yu, X., Li, N. et al. Efficient tomato harvesting robot based on image processing and deep learning. Precision Agric 24, 254–287 (2023). https://doi.org/10.1007/s11119-022-09944-w
