
Multi-Matching-Based Vision Navigation Referencing Map Tile

  • Original Paper
  • Published in: International Journal of Aeronautical and Space Sciences

Abstract

Modern unmanned aerial vehicles (UAVs), including combat drones, are usually equipped with cameras. The camera is mainly used in the terminal phase: once the UAV detects the target, it ensures stable guided flight toward it and thereby improves attack accuracy. Despite the high cost of this equipment, it is in fact underutilized outside that final stage. The method proposed here extends the use of the UAV-mounted camera to the navigation stage: when GPS cannot estimate the position because of jamming or a sensor error, the position estimate computed through vision navigation can be used instead.
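To illustrate the general idea of map-tile-referenced vision navigation (not the paper's specific multi-matching algorithm, whose details are given in the full text), the sketch below locates a downward-looking camera frame inside a georeferenced map tile by exhaustive normalized cross-correlation and converts the matched pixel offset to geographic coordinates. The function names, the exhaustive search, and the linear pixel-to-lat/lon mapping are all illustrative assumptions.

```python
import numpy as np

def locate_in_tile(frame, tile):
    """Find the (row, col) offset of `frame` inside `tile` by exhaustive
    normalized cross-correlation. Illustration only: real systems use
    feature matching or FFT-based correlation for speed and robustness."""
    fh, fw = frame.shape
    th, tw = tile.shape
    f = frame - frame.mean()
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(th - fh + 1):
        for c in range(tw - fw + 1):
            patch = tile[r:r + fh, c:c + fw]
            p = patch - patch.mean()
            denom = np.sqrt((f * f).sum() * (p * p).sum())
            score = (f * p).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc, best_score

def pixel_to_latlon(row, col, tile_shape, bounds):
    """Convert a tile pixel to (lat, lon) given the tile's geographic
    bounds (lat_top, lon_left, lat_bottom, lon_right). A linear mapping
    is assumed, which is adequate only for small tiles."""
    lat_t, lon_l, lat_b, lon_r = bounds
    th, tw = tile_shape
    lat = lat_t + (lat_b - lat_t) * row / (th - 1)
    lon = lon_l + (lon_r - lon_l) * col / (tw - 1)
    return lat, lon
```

In use, the matched offset of the frame's center pixel would be passed to `pixel_to_latlon` to produce the position estimate that substitutes for GPS.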





Acknowledgments

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. 2020R1A2C1011745). This research was also supported by the Unmanned Vehicles Advanced Core Technology Research and Development Program through the National Research Foundation of Korea (NRF) and the Unmanned Vehicle Advanced Research Center (UVARC), funded by the Ministry of Science and ICT, Republic of Korea (No. 2020M3C1C1A01086407).

Author information

Corresponding author

Correspondence to Seokhyun Shin.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Shin, S., Min, D. & Lee, J. Multi-Matching-Based Vision Navigation Referencing Map Tile. Int. J. Aeronaut. Space Sci. 22, 1119–1140 (2021). https://doi.org/10.1007/s42405-021-00373-z


Keywords

Navigation