
Lidar-aided Autonomous Landing and Vision-based Taxiing for Fixed-Wing UAV

  • Research Article
  • Published:
Journal of the Indian Society of Remote Sensing

Abstract

Autonomous UAV technology remains limited in its ability to land safely at airfields that have not been precisely surveyed and where GPS is unavailable. In this paper, we present a multi-sensor system for the automatic landing of a fixed-wing UAV. The system is composed of a high-precision aircraft controller, a laser range finder (Lidar) and a vision module for runway detection and tracking. The position of the fixed-wing UAV is estimated using Lidar, and this estimate guides the glide phase until flare. A federated extended Kalman filter (EKF) structure is then customised to use the IMU, GPS and Lidar solutions as independent measurements of the vehicle's position. The framework can also integrate the vision solution, keeping the position estimate smooth and robust throughout the landing. For taxiing, a neural network processes the live video stream from the camera so that the UAV tracks the runway precisely.
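The federated EKF described above combines an IMU-driven prediction with GPS and Lidar measurements handled in separate local filters whose estimates are fused by a master filter. The sketch below illustrates that structure under simplifying assumptions: a 1-D altitude/vertical-speed state, illustrative noise values, and sensor models invented for the example. It is not the filter formulation or the parameter set used in the paper.

```python
import numpy as np

# Minimal sketch of a federated EKF for landing-phase state estimation.
# State = [altitude (m), vertical speed (m/s)]; all matrices and noise
# values are illustrative assumptions, not the paper's parameters.

class LocalKalman:
    """One local filter: IMU-driven prediction + one altitude sensor update."""
    def __init__(self, r_meas):
        self.x = np.zeros(2)             # [altitude, vertical speed]
        self.P = np.eye(2) * 10.0        # initial covariance (assumed)
        self.Q = np.diag([0.05, 0.1])    # process noise (assumed)
        self.R = np.array([[r_meas]])    # measurement noise of this sensor
        self.H = np.array([[1.0, 0.0]])  # sensor observes altitude only

    def predict(self, accel_z, dt):
        # Constant-acceleration prediction driven by IMU vertical acceleration.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])
        self.x = F @ self.x + B * accel_z
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        # Standard Kalman measurement update with this sensor's altitude reading.
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

def federated_fusion(filters):
    """Master filter: information-weighted combination of the local estimates."""
    info_sum = np.zeros((2, 2))
    state_sum = np.zeros(2)
    for f in filters:
        info = np.linalg.inv(f.P)
        info_sum += info
        state_sum += info @ f.x
    P_fused = np.linalg.inv(info_sum)
    return P_fused @ state_sum, P_fused

# Example: GPS and Lidar run as independent local filters sharing the IMU prediction.
gps_filter = LocalKalman(r_meas=4.0)     # GPS altitude, coarser (assumed noise)
lidar_filter = LocalKalman(r_meas=0.04)  # Lidar range to runway, finer (assumed noise)

accel_z, dt = -0.2, 0.02                 # sample IMU input and time step
for f in (gps_filter, lidar_filter):
    f.predict(accel_z, dt)
gps_filter.update(np.array([48.7]))      # example GPS altitude (m)
lidar_filter.update(np.array([49.1]))    # example Lidar height above runway (m)

x_fused, P_fused = federated_fusion([gps_filter, lidar_filter])
print("fused altitude estimate:", x_fused[0])
```

Keeping GPS and Lidar in separate local filters means a fault or outage in one sensor does not corrupt the other's estimate before fusion, which is the usual motivation for a federated layout over a single centralised EKF.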





Author information


Corresponding author

Correspondence to K. Senthil Kumar.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Kumar, K.S., Venkatesan, M. & Karuppaswamy, S. Lidar-aided Autonomous Landing and Vision-based Taxiing for Fixed-Wing UAV. J Indian Soc Remote Sens 49, 629–640 (2021). https://doi.org/10.1007/s12524-020-01238-w


Keywords

Navigation