High-frame-rate Video-based Multicopter Tracking System Using Pixel-level Short-time Fourier Transform

  • Short Paper
  • Published:
Journal of Intelligent & Robotic Systems

Abstract

In this study, we develop a telephoto pan-tilt drone search-and-track system for multicopters flying in a wide area spanning hundreds of meters, located hundreds of meters away from the system. The system detects the periodic brightness changes produced by the fast-rotating drone propellers in high-frame-rate (HFR) video. The temporal frequency responses of the pixel brightness signals in the HFR video are computed by performing pixel-level short-time Fourier transforms (STFTs). By detecting the peak frequencies, the drone propellers are localized as vibration sources, and their rotation speed is estimated to monitor the flight status of the drone. For real-time localization and flight monitoring, the proposed system performs pixel-level STFTs on 500 fps video of 720×540 pixels using video processing accelerated by graphics processing units. This allows a multicopter to be tracked in real time at the center of the camera view by a galvanomirror pan-tilt active vision system with visual feedback. We verified the system's effectiveness by examining HFR videos of flying multicopters with different appearances, and by conducting outdoor tracking experiments with multicopters flying at an altitude of 70 m, 200 m ahead of the camera, against a mountainous background.
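
The core idea can be sketched compactly: for every pixel, the brightness values over a short window of consecutive HFR frames are Fourier-transformed, and pixels whose peak temporal frequency falls in the band expected for propeller-induced flicker are treated as belonging to the drone. The NumPy sketch below illustrates this pixel-level STFT step and the pixel centroid that could serve as the visual-feedback target; it is not the authors' GPU implementation, and the window length, frequency band, power threshold, and function names are illustrative assumptions.

```python
import numpy as np

def peak_frequency_map(frames, fps=500, band=(30.0, 240.0)):
    """Minimal sketch of pixel-level STFT peak detection.

    frames : ndarray of shape (T, H, W); grayscale brightness over T
             consecutive HFR frames (one STFT window).
    fps    : capture rate of the HFR camera (500 fps in the paper).
    band   : (low, high) temporal-frequency band in Hz where
             propeller-induced flicker is expected (illustrative values).

    Returns per-pixel peak frequency (Hz) and its spectral power.
    """
    T = frames.shape[0]
    window = np.hanning(T)[:, None, None]            # taper each pixel's time series
    spectrum = np.fft.rfft(frames * window, axis=0)  # FFT along the time axis only
    power = np.abs(spectrum) ** 2
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)

    # Restrict the peak search to the expected propeller band.
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    power_band = power[in_band]
    peak_idx = np.argmax(power_band, axis=0)
    peak_freq = freqs[in_band][peak_idx]
    peak_power = np.take_along_axis(power_band, peak_idx[None], axis=0)[0]
    return peak_freq, peak_power

def drone_centroid(peak_power, power_thresh):
    """Centroid of pixels whose in-band peak power exceeds a threshold,
    usable as the error signal for pan-tilt visual feedback."""
    ys, xs = np.nonzero(peak_power > power_thresh)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()
```

With a 500 fps camera and, say, a 64-frame window, each call covers about 128 ms of video. The paper reaches real-time rates on 720×540 frames only because the per-pixel transforms are offloaded to GPUs; this CPU-only sketch is meant to convey the signal-processing idea rather than the throughput.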

Funding

This work was supported in part by the MIC (Ministry of Internal Affairs and Communications, Japan) Strategic Information and Communications R&D Promotion Programme under Grant 181608001.

Author information

Corresponding author

Correspondence to Idaku Ishii.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Shimasaki, K., Fujiwara, N., Hu, S. et al. High-frame-rate Video-based Multicopter Tracking System Using Pixel-level Short-time Fourier Transform. J Intell Robot Syst 103, 36 (2021). https://doi.org/10.1007/s10846-021-01483-2

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s10846-021-01483-2
