
Integration of Synced Side-Scan Sonar and Video Data for Seafloor Investigations

  • INSTRUMENTS AND METHODS

Abstract

For visual studies of the seafloor and underwater objects, and to improve the efficiency of interpreting underwater video data, it is in some cases appropriate to augment the video record with side-scan sonar data. This article proposes a method for integrating sonar and video data that makes it possible to assess the state of underwater objects at the meso- and microscale. The method is based on creating panoramic images from the video data and then combining them with side-scan sonograms.
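To make the panoramic-mosaicking step concrete, here is a minimal sketch of how consecutive video frames can be stitched into a strip panorama using feature matching and a RANSAC-estimated homography. It illustrates the general technique only, not the authors' implementation; the use of OpenCV's ORB detector, the frame step, the canvas size, and the file names are assumptions made for the example.

```python
# Illustrative sketch: stitch subsampled video frames into a strip mosaic
# using ORB features and a RANSAC-estimated homography (OpenCV).
import cv2
import numpy as np


def pairwise_homography(prev_gray, cur_gray, detector, matcher):
    """Estimate the homography mapping the current frame onto the previous one."""
    kp1, des1 = detector.detectAndCompute(prev_gray, None)
    kp2, des2 = detector.detectAndCompute(cur_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.knnMatch(des2, des1, k=2)
    # Lowe ratio test keeps only distinctive matches.
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
    if len(good) < 10:
        return None
    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H


def build_mosaic(video_path, step=15, canvas_size=(4000, 2000)):
    """Composite subsampled video frames into a single panoramic canvas."""
    cap = cv2.VideoCapture(video_path)
    detector = cv2.ORB_create(2000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), np.uint8)
    H_total = np.eye(3)              # accumulated frame-to-canvas transform
    prev_gray = None
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        idx += 1
        if (idx - 1) % step:         # subsample frames to limit overlap
            continue
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            H = pairwise_homography(prev_gray, gray, detector, matcher)
            if H is None:            # skip frames that cannot be matched
                continue
            H_total = H_total @ H    # chain pairwise transforms back to frame 1
        warped = cv2.warpPerspective(frame, H_total, canvas_size)
        mask = warped.any(axis=2)
        canvas[mask] = warped[mask]  # newest frame overwrites older pixels
        prev_gray = gray
    cap.release()
    return canvas


# Hypothetical usage:
# mosaic = build_mosaic("bottom_video.mp4")
# cv2.imwrite("video_mosaic.png", mosaic)
```

In the proposed workflow the resulting panorama would then be matched against the geo-referenced side-scan sonogram of the same survey line; that registration step, as well as drift correction and blending, is outside the scope of this sketch.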



ACKNOWLEDGMENTS

The authors are grateful to Ya.I. Belevitnev, as well as to the crew of the R/V Akademik Mstislav Keldysh and its captain, Yu.N. Gorbach, for their help in preparing and conducting the experimental studies.

Funding

The study was carried out under the state assignment of the IO RAS (topic no. 0128-2021-0010) with the support of the Russian Foundation for Basic Research (project nos. 20-05-00384 A and 18-05-60070).

Author information

Corresponding author

Correspondence to I. M. Anisimov.


Cite this article

Anisimov, I.M., Tronza, S.N. Integration of Synced Side-Scan Sonar and Video Data for Seafloor Investigations. Oceanology 61, 423–432 (2021). https://doi.org/10.1134/S0001437021030024

