Abstract
Fusion of infrared and visible images is a technique that combines information from two different sensors viewing the same scene. It provides highly effective information complementation and is widely used in monitoring systems and military applications. Owing to the limited depth of field of imaging devices, visible images may fail to reveal targets that are obscured by poor lighting conditions or by a background colour similar to the target. To address this problem, a simple and efficient fusion approach for infrared and visible images is proposed; it extracts target details from infrared images and enhances the visible content in order to improve the performance of monitoring systems. The method relies on maximum and minimum operations in neutrosophic fuzzy sets. First, each image is transformed from the spatial domain into the neutrosophic domain, where it is described by three membership sets: truth membership, indeterminacy membership, and falsity membership. The indeterminacy in the input data is then handled to produce a comprehensive fusion result. Finally, a de-neutrosophication step maps the membership values back into the ordinary image space. Experiments evaluate the performance of this approach and compare it with recent image fusion methods using several objective evaluation criteria; the results demonstrate that the proposed method achieves outstanding visual quality and excellent objective indicators.
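The pipeline described above (spatial domain → neutrosophic T/I/F memberships → max/min fusion → de-neutrosophication) can be sketched in NumPy. This is a minimal illustration, not the authors' exact method: the membership definitions below (T as normalized intensity, I as normalized local variation, F = 1 − T) and the de-neutrosophication rule are assumptions chosen for simplicity.

```python
import numpy as np

def to_neutrosophic(img):
    """Map a grayscale image into the neutrosophic domain (T, I, F).

    Illustrative membership definitions (assumed, not the paper's exact
    formulas): T = normalized intensity, I = normalized deviation from
    the 3x3 local mean, F = 1 - T.
    """
    g = img.astype(np.float64)
    T = (g - g.min()) / (g.max() - g.min() + 1e-12)      # truth membership
    # 3x3 local mean via shifted sums over a reflect-padded copy
    pad = np.pad(g, 1, mode="reflect")
    h, w = g.shape
    local_mean = sum(pad[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    d = np.abs(g - local_mean)
    I = (d - d.min()) / (d.max() - d.min() + 1e-12)      # indeterminacy
    F = 1.0 - T                                          # falsity membership
    return T, I, F

def fuse(ir_img, vis_img):
    """Fuse two registered grayscale images with max/min neutrosophic ops."""
    T1, I1, F1 = to_neutrosophic(ir_img)
    T2, I2, F2 = to_neutrosophic(vis_img)
    # max on truth keeps the stronger evidence at each pixel;
    # min on indeterminacy and falsity suppresses uncertain responses
    T = np.maximum(T1, T2)
    I = np.minimum(I1, I2)
    # de-neutrosophication (assumed rule): down-weight pixels whose
    # indeterminacy is high, then rescale back to an 8-bit image
    fused = T * (1.0 - I)
    fused = (fused - fused.min()) / (fused.max() - fused.min() + 1e-12)
    return np.uint8(255 * fused)
```

The fused result takes the brighter (higher-truth) evidence from either sensor, which matches the stated goal of recovering infrared targets that the visible image misses.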
Data Availability
The data used to support the findings of this study are included within the article.
Ethics declarations
Ethics approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Alghamdi, R.S., Alshehri, N.O. Fusion of infrared and visible images using neutrosophic fuzzy sets. Multimed Tools Appl 80, 25927–25941 (2021). https://doi.org/10.1007/s11042-021-10911-2