Abstract
In this research, we propose a near-infrared (NIR) and visible (VIS) image fusion method based on saliency-map-guided multi-scale transform decomposition (SMG-MST) to address the problem of color distortion. Although existing NIR and VIS image fusion methods can enhance the texture information of the fused image, they cannot control the light scattered from objects in the scene, which leads to color distortion in the fused result. Because color-distorted regions are usually highly salient, a saliency map is a natural tool for addressing this problem. In this paper, the visible image, guided by a saliency map, is introduced into the low-frequency component, which suppresses excessive scattered light from objects in the image. In addition, the local entropy of the NIR image is used to guide the visible image, so the result contains more details. Both qualitative and quantitative experiments demonstrate the effectiveness of our algorithm, and a comparison of running times shows the efficiency of our method.
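To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea under stated assumptions rather than the authors' exact SMG-MST implementation: a simple two-scale Gaussian split stands in for the multi-scale transform, a spectral-residual map stands in for the paper's saliency map, and the window sizes and blend weights are placeholders. The function names (`fuse_nir_vis`, `spectral_residual_saliency`) are hypothetical; only standard OpenCV, NumPy, and scikit-image calls are used.

```python
# Minimal sketch of saliency-guided low-frequency fusion and
# local-entropy-guided detail fusion for NIR/VIS image pairs.
# All parameters and the saliency/entropy estimators are assumptions,
# not the authors' SMG-MST settings.

import cv2
import numpy as np
from skimage.filters.rank import entropy
from skimage.morphology import disk


def spectral_residual_saliency(gray):
    """Crude spectral-residual saliency proxy (an assumption, not the paper's map)."""
    f = np.fft.fft2(gray.astype(np.float64))
    log_amp = np.log1p(np.abs(f))
    residual = log_amp - cv2.blur(log_amp, (3, 3))
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * np.angle(f)))) ** 2
    sal = cv2.GaussianBlur(sal, (9, 9), 2.5)
    return cv2.normalize(sal, None, 0.0, 1.0, cv2.NORM_MINMAX)


def fuse_nir_vis(vis_bgr, nir_gray):
    """Fuse a visible BGR image with a registered single-channel NIR image."""
    # Work on the luminance channel so chrominance (color) is taken from VIS only.
    ycrcb = cv2.cvtColor(vis_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    y, cr, cb = cv2.split(ycrcb)
    nir = nir_gray.astype(np.float32)

    # Two-scale split as a stand-in for the multi-scale transform decomposition.
    base_y = cv2.GaussianBlur(y, (31, 31), 8)
    base_n = cv2.GaussianBlur(nir, (31, 31), 8)
    det_y, det_n = y - base_y, nir - base_n

    # Low frequency: the VIS saliency map weights the blend, pulling highly
    # salient (often over-bright, scattered-light) regions toward the VIS base.
    sal = spectral_residual_saliency(y).astype(np.float32)
    base_f = sal * base_y + (1.0 - sal) * 0.5 * (base_y + base_n)

    # High frequency: local entropy of the NIR image selects detail coefficients,
    # so texture-rich NIR regions contribute their details to the result.
    nir_u8 = cv2.normalize(nir, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    y_u8 = cv2.normalize(y, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    det_f = np.where(entropy(nir_u8, disk(5)) > entropy(y_u8, disk(5)), det_n, det_y)

    fused_y = np.clip(base_f + det_f, 0, 255).astype(np.float32)
    fused = cv2.merge([fused_y, cr, cb]).astype(np.uint8)
    return cv2.cvtColor(fused, cv2.COLOR_YCrCb2BGR)
```

Fusing only the luminance channel and keeping the VIS chrominance untouched is one simple way to limit color distortion, which is consistent with the goal stated in the abstract, although the paper's actual transform and weighting scheme may differ.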
Data Availability
All public images used in the manuscript are available at: https://www.epfl.ch/labs/ivrl/research/downloads/rgb-nir-scene-dataset/.
References
Ancuti CO, Ancuti C (2013) Single image dehazing by multi-scale fusion. IEEE Trans Image Process 22(8):3271–3282
Ancuti C, Ancuti CO (2014) Effective contrast-based dehazing for robust image matching. IEEE Geosci Remote Sens Lett 11(11):1871–1875
Bernal EA, Yang X, Li Q, Kumar J, Madhvanath S, Ramesh P, Bala R (2017) Deep temporal multimodal fusion for medical procedure monitoring using wearable sensors. IEEE Trans Multimedia, pp 1–1
Chen J, Li X, Luo L, Mei X, Ma J (2020) Infrared and visible image fusion based on target-enhanced multiscale transform decomposition. Inform Sci 508:64–78
Chen Q, Sun J, Palade V, Shi X, Liu L (2019) Hierarchical clustering based band selection algorithm for hyperspectral face recognition. IEEE Access 7:24333–24342
Colvero CP, Cordeiro MCR, De Faria GV, Von der Weid JP (2005) Experimental comparison between far- and near-infrared wavelengths in free-space optical systems. Microw Opt Technol Lett 46(4):319–323
Cui G, Feng H, Xu Z, Li Q, Chen Y (2015) Detail preserved fusion of visible and infrared images using regional saliency extraction and multi-scale image decomposition. Opt Commun 341:199–209
Epfl database. Available at:https://www.epfl.ch/labs/ivrl/research/downloads/rgb-nir-scene-dataset/. Accessed 27 Feb 2023
Eskicioglu AM, Fisher PS (1995) Image quality measures and their performance. IEEE Trans Commun 43(12):2959–2965
Fattal R (2015) Dehazing using color-lines. ACM Trans Graph 34(1)
Feng C, Zhuo S, Zhang X, Shen L, Süsstrunk S (2013) Near-infrared guided color image dehazing. In: 2013 IEEE international conference on image processing, pp 2363–2367. IEEE
Fernandez-Beltran R, Haut J, Paoletti M, Plaza J, Plaza A, Pla F (2018) Remote sensing image fusion using hierarchical multimodal probabilistic latent semantic analysis. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 11(12):4982–4993
Fredembach C, Süsstrunk S (2008) Colouring the near-infrared. In: Color and imaging conference, vol 2008, pp 176–182. Society for Imaging Science and Technology
Gijsenij A, Gevers T, Van De Weijer J (2011) Computational color constancy: survey and experiments. IEEE Trans Image Process 20(9):2475–2489
Jang D, Park R (2017) Colour image dehazing using near-infrared fusion. IET Image Process 11(8):587–594
Jiang J, Feng X, Liu F, Xu Y, Huang H (2019) Multi-spectral rgb-nir image classification using double-channel cnn. IEEE Access 7:20607–20613
Lan X, Zhang L, Shen H, Yuan Q, Li H (2013) Single image haze removal considering sensor blur and noise. Springer, pp 1–13
Li Z, Hu H, Zhang W, Pu S, Li B (2020) Spectrum characteristics preserved visible and near-infrared image fusion algorithm. IEEE Trans Multimedia 23:306–319
Li S, Kang X, Hu J (2013) Image fusion with guided filtering. IEEE Trans Image Process 22(7):2864–2875
Li H, Wu X (2019) Densefuse: a fusion approach to infrared and visible images. IEEE Trans Image Process 28(5):2614–2623
Ma J, Ma Y, Li C (2019) Infrared and visible image fusion methods and applications: a survey. Information Fusion, pp 153–178
Ma J, Tang L, Fan F, Huang J, Mei X, Ma Y (2022) Swinfusion: cross-domain long-range learning for general image fusion via swin transformer. IEEE/CAA Journal of Automatica Sinica 9(7):1200–1217
Ma J, Xu H, Jiang J, Mei X, Zhang X (2020) Ddcgan: a dual-discriminator conditional generative adversarial network for multi-resolution image fusion. IEEE Trans Image Process 29:4980–4995
Ma J, Yu W, Liang P, Li C, Jiang J (2019) Fusiongan: a generative adversarial network for infrared and visible image fusion. Information Fusion 48:11–26
Mertens T, Kautz J, Van Reeth F (2007) Exposure fusion. In: 15th Pacific conference on computer graphics and applications (PG’07), pp 382–390
Nayar SK, Narasimhan SG (1999) Vision in bad weather. In: Proceedings of the Seventh IEEE international conference on computer vision, vol 2, pp 820–827
Burt PJ, Adelson EH (1987) The Laplacian pyramid as a compact image code. In: Fischler MA, Firschein O (eds) Readings in computer vision, pp 671–679. Morgan Kaufmann, San Francisco (CA)
Qu G, Zhang D, Yan P (2002) Information measure for performance of image fusion. Electron Lett 38(7):313–315
Ram Prabhakar K, Sai Srikar V, Venkatesh Babu R (2017) Deepfuse: a deep unsupervised approach for exposure fusion with extreme exposure image pairs. In: Proceedings of the IEEE international conference on computer vision (ICCV)
Sappa A, Carvajal J, Aguilera C, Oliveira M, Romero D, Vintimilla B (2016) Wavelet-based visible and infrared image fusion: a comparative study. Sensors 16(6):861
Schaul L, Fredembach C, Süsstrunk S (2009) Color image dehazing using the near-infrared. In: 2009 16th IEEE international conference on image processing (ICIP), pp 1629–1632. IEEE
Sulami M, Glatzer I, Fattal R, Werman M (2014) Automatic recovery of the atmospheric light in hazy images. In: 2014 IEEE international conference on computational photography (ICCP), pp 1–11
Tang L, Deng Y, Ma Y, Huang J, Ma J (2022) Superfusion: a versatile image registration and fusion network with semantic awareness. IEEE/CAA Journal of Automatica Sinica 9(12):2121–2137
Tang L, Yuan J, Ma J (2022) Image fusion in the loop of high-level vision tasks a semantic-aware real-time infrared and visible image fusion network. Inf Fusion 82:28–42
Vanmali AV, Gadre VM (2017) Visible and NIR image fusion using weight-map-guided Laplacian–Gaussian pyramid for improving scene visibility. Sādhanā 42(7):1063–1082
Vanmali A, Kelkar S, Gadre V (2015) A novel approach for image dehazing combining visible-nir images. In: 2015 Fifth national conference on computer vision, pattern recognition, image processing and graphics (NCVPRIPG), pp 1–4
Wesley RJ, Jan AAV, Fethi BA (2008) Assessment of image fusion procedures using entropy, image quality, and multispectral classification. J Appl Remote Sens 2(1):1–28
Xu H, Ma J, Jiang J, Guo X, Ling H (2022) U2fusion: a unified unsupervised image fusion network. IEEE Trans Pattern Anal Mach Intell 44(1):502–518
Xu H, Ma J, Zhang X-P (2020) Mef-gan: multi-exposure image fusion via generative adversarial networks. IEEE Trans Image Process 29:7203–7216
Zhang Y, Liu Y, Sun P, Yan H, Zhao X, Zhang L (2020) Ifcnn: a general image fusion framework based on convolutional neural network. Information Fusion 54:99–118
Zhang H, Ma J (2021) SDNet: a versatile squeeze-and-decomposition network for real-time image fusion. Int J Comput Vis 129(10):2761–2785
Zhang X, Terence S, Miao X (2008) Enhancing photographs with near infra-red images. In: 2008 IEEE conference on computer vision and pattern recognition, pp 1–8
Zhu Q, Mai J, Shao L (2015) A fast single image haze removal algorithm using color attenuation prior. IEEE Trans Image Process 24(11):3522–3533
Acknowledgements
This work was supported by the National Natural Science Foundation of China under grant nos. 62073304, 41977242 and 61973283.
Author information
Authors and Affiliations
School of Automation, China University of Geosciences, Wuhan, 430074, China
Chen Jun, Cai Lei & Liu Wei
Hubei Key Laboratory of Advanced Control and Intelligent Automation for Complex Systems, Wuhan, China
Chen Jun, Cai Lei & Liu Wei
Engineering Research Center of Intelligent Technology for Geo-Exploration, Ministry of Education, Wuhan, China
Chen Jun, Cai Lei & Liu Wei
Chinese Academy of Sciences, Shanghai Institute of Technical Physics, Shanghai, 200083, China
Yu Yang
Key Laboratory of Infrared System Detecting and Imaging Technology, Chinese Academy of Sciences, Shanghai, 200083, China
Yu Yang
Corresponding author
Correspondence to Chen Jun.
Ethics declarations
Consent for Publication
The work described has not been published before, and its publication has been approved by the responsible authorities at the institution where the work was carried out.
Competing interests
The authors declare that there are no competing interests regarding the publication of this article.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Jun, C., Lei, C., Wei, L. et al. Fusion of near-infrared and visible images based on saliency-map-guided multi-scale transformation decomposition. Multimed Tools Appl 82, 34631–34651 (2023). https://doi.org/10.1007/s11042-023-14709-2