
Fusion of infrared and visible images using neutrosophic fuzzy sets

Published in: Multimedia Tools and Applications

Abstract

Fusion of infrared and visible images is a technique that combines information from two different sensors observing the same scene. It provides highly effective information complementation and is widely used in monitoring systems and military applications. Because of the limited depth of field of an imaging device, visible images may fail to reveal targets that are obscured by poor lighting or by a background whose color resembles the target. To address this problem, a simple and efficient approach for fusing infrared and visible images is proposed that extracts target details from the infrared image and enhances the visible one, improving the performance of monitoring systems. The method relies on maximum and minimum operations on neutrosophic fuzzy sets. First, each image is transformed from the spatial domain to the neutrosophic domain, where it is described by three membership sets: truth membership, indeterminacy membership, and falsity membership. The indeterminacy in the input data is handled explicitly to produce a comprehensive fusion result. Finally, a deneutrosophication step maps the membership values back to the ordinary image space. Experimental results evaluate the performance of this approach and compare it with recent image fusion methods using several objective evaluation criteria. These experiments demonstrate that the proposed method achieves outstanding visual quality and excellent objective indicators.
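The three-stage pipeline the abstract describes (neutrosophication, max/min fusion, deneutrosophication) can be sketched as follows. This is a minimal illustration in NumPy, not the paper's exact method: the membership definitions, the 3×3 local-mean indeterminacy measure, and the deneutrosophication formula are all illustrative assumptions.

```python
import numpy as np

def local_mean(g, k=3):
    """Mean filter over a k-by-k window, edge-padded (pure NumPy)."""
    p = k // 2
    gp = np.pad(g, p, mode="edge")
    out = np.zeros_like(g)
    for dy in range(k):
        for dx in range(k):
            out += gp[dy:dy + g.shape[0], dx:dx + g.shape[1]]
    return out / (k * k)

def to_neutrosophic(img):
    """Map a grayscale image to (T, I, F) membership maps in [0, 1]."""
    g = img.astype(float)
    g = (g - g.min()) / (g.max() - g.min() + 1e-12)  # normalize to [0, 1]
    T = g                                             # truth: normalized intensity
    F = 1.0 - g                                       # falsity: complement of truth
    # Indeterminacy: deviation from local homogeneity (illustrative choice)
    I = np.abs(g - local_mean(g))
    I = (I - I.min()) / (I.max() - I.min() + 1e-12)
    return T, I, F

def fuse(ir, vis):
    """Fuse an infrared and a visible image via max/min neutrosophic rules."""
    T1, I1, F1 = to_neutrosophic(ir)
    T2, I2, F2 = to_neutrosophic(vis)
    # Max/min fusion: keep the stronger truth, the weaker falsity,
    # and the lower indeterminacy of the two sources.
    T = np.maximum(T1, T2)
    F = np.minimum(F1, F2)
    I = np.minimum(I1, I2)
    # Deneutrosophication (illustrative): average of truth and non-falsity,
    # pulled toward 0.5 where indeterminacy is high.
    fused = (T + (1.0 - F)) / 2.0 * (1.0 - I) + 0.5 * I
    return np.clip(fused, 0.0, 1.0)
```

Any monotone combination of the three maps could serve as the deneutrosophication step; the one above is only meant to show how the indeterminacy channel moderates the fused intensity.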



Data Availability

The data used to support the findings of this study are included within the article.


Author information

Corresponding author: Rania S. Alghamdi.

Ethics declarations

Ethics approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of interest

The authors declare that there is no conflict of interests regarding the publication of this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Alghamdi, R.S., Alshehri, N.O. Fusion of infrared and visible images using neutrosophic fuzzy sets. Multimed Tools Appl 80, 25927–25941 (2021). https://doi.org/10.1007/s11042-021-10911-2
