
Accurate visual tracking via reliable patch

Original article, published in The Visual Computer.

Abstract

Traditional particle-filter- and correlation-filter-based trackers are prone to low accuracy and poor robustness when the target undergoes occlusion, rotation, or scale variation in complex scenes. To address this, an accurate reliable-patch-based tracker is proposed that exploits the complementary advantages of the particle filter and the correlation filter. Specifically, to cope with continuous full occlusion, the target is divided into numerous patches by combining random and hand-crafted partition methods, and an effective target-position estimation strategy is presented. Then, based on the motion relationship between each patch and the global target in the particle filter framework, two resampling rules are designed to remove unreliable particles and avoid tracking drift, so that the target position can be estimated from the most reliable patches identified. Finally, an effective scale estimation approach is presented in which the Manhattan distance between the reliable patches is used to estimate the target scale, i.e., the target width and height, separately. Experimental results show that the proposed tracker is not only robust to occlusion, rotation, and scale variation, but also outperforms the compared state-of-the-art trackers in overall performance.
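The abstract's scale-estimation idea can be illustrated with a minimal sketch: if the Manhattan (L1) spread of the reliable patch centres grows or shrinks between frames, the target width and height are rescaled by the corresponding per-axis ratio. All function names and the update rule below are illustrative assumptions, not the authors' implementation.

```python
def estimate_scale(prev_centers, curr_centers, prev_size):
    """Estimate a new (width, height) from how the reliable patch
    centres have spread between two frames.

    prev_centers, curr_centers: lists of (x, y) patch centres
    prev_size: (width, height) of the target in the previous frame
    """
    def spread(centers, axis):
        # Mean pairwise Manhattan (L1) distance of the centres
        # along one axis (x: axis=0, y: axis=1).
        vals = [c[axis] for c in centers]
        return sum(abs(a - b) for a in vals for b in vals) / (len(vals) ** 2)

    w_prev, h_prev = prev_size
    # Scale each dimension independently by the ratio of the current
    # spread to the previous spread, guarding against division by zero.
    sx = spread(curr_centers, 0) / max(spread(prev_centers, 0), 1e-6)
    sy = spread(curr_centers, 1) / max(spread(prev_centers, 1), 1e-6)
    return w_prev * sx, h_prev * sy
```

For example, if the patch centres move twice as far apart along both axes, the estimated width and height both double, matching the intuition that the target has grown.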



Acknowledgements

This work was partially supported by the National Natural Science Foundation of China under Grant Nos. 61901183, 61976098 and 61602191, the Natural Science Foundation of Fujian Province under Grant No. 2019J01010561, and the Science and Technology Bureau of Quanzhou under Grant No. 2017G046.

Author information

Corresponding author

Correspondence to Detian Huang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yang, M., Lin, Y., Huang, D. et al. Accurate visual tracking via reliable patch. Vis Comput 38, 625–638 (2022). https://doi.org/10.1007/s00371-020-02038-6

