CHEN Xin. Infrared and Visible Image Fusion Using Double Attention Generative Adversarial Networks[J]. Infrared Technology, 2023, 45(6): 639-648.

Infrared and Visible Image Fusion Using Double Attention Generative Adversarial Networks

More Information
  • Received Date: August 22, 2022
  • Revised Date: September 13, 2022
  • In this study, a double-attention generative adversarial network (DAGAN) for infrared and visible image fusion is proposed to address the issue that most GAN-based infrared and visible image fusion methods apply the attention mechanism only in the generator and lack attention perception in the discrimination stage. DAGAN introduces a multi-scale attention module that combines spatial and channel attention in different scale spaces and applies it in both the image generation and discrimination stages, so that the generator and the discriminator can each identify the most discriminative regions in an image. In addition, an attention loss function is proposed that uses the attention maps from the discrimination stage to compute the attention loss, preserving more target and background information. Tests on the public TNO dataset show that, compared with seven other fusion methods, DAGAN achieves the best visual quality and the highest fusion efficiency.
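The combination of channel and spatial attention across scales described in the abstract can be sketched as follows. This is a minimal NumPy illustration only, not the paper's DAGAN architecture: the pooling-based gating, nearest-neighbour rescaling, and all function names here are simplifying assumptions standing in for the learned convolutional attention blocks used in the actual generator and discriminator.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W); squeeze spatial dims into per-channel descriptors,
    # then gate each channel with a weight in (0, 1)
    avg = feat.mean(axis=(1, 2))            # (C,)
    mx = feat.max(axis=(1, 2))              # (C,)
    w = sigmoid(avg + mx)                   # (C,) channel weights
    return feat * w[:, None, None]

def spatial_attention(feat):
    # collapse channels into a single spatial saliency map
    avg = feat.mean(axis=0)                 # (H, W)
    mx = feat.max(axis=0)                   # (H, W)
    m = sigmoid(avg + mx)                   # (H, W) spatial weights
    return feat * m[None, :, :]

def multi_scale_attention(feat, scales=(1, 2)):
    # apply channel then spatial attention at several downsampled scales
    # and average the re-upsampled results (nearest-neighbour resize)
    C, H, W = feat.shape
    out = np.zeros_like(feat)
    for s in scales:
        small = feat[:, ::s, ::s]
        att = spatial_attention(channel_attention(small))
        up = att.repeat(s, axis=1).repeat(s, axis=2)[:, :H, :W]
        out += up
    return out / len(scales)
```

In the paper's setting, an attention module of this kind is attached to both the generator and the discriminator, and the discriminator's attention maps additionally feed an attention loss term.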
  • [1]
    董安勇, 杜庆治, 苏斌, 等. 基于卷积神经网络的红外与可见光图像融合[J]. 红外技术, 2020, 42(7): 660-669. http://hwjs.nvir.cn/article/id/hwjs202007009

    DONG Anyong, DU Qingzhi, SU Bin, et al. Infrared and visible image fusion based on convolutional neural network[J]. Infrared Technology, 2020, 42(7): 660-669. http://hwjs.nvir.cn/article/id/hwjs202007009
    [2]
    罗迪, 王从庆, 周勇军. 一种基于生成对抗网络与注意力机制的可见光和红外图像融合方法[J]. 红外技术, 2021, 43(6): 566-574. http://hwjs.nvir.cn/article/id/3403109e-d8d7-45ed-904f-eb4bc246275a

    LUO Di, WANG Congqing, ZHOU Yongjun. A visible and infrared image fusion method based on generative adversarial networks and attention mechanism[J]. Infrared Technology, 2021, 43(6): 566-574. http://hwjs.nvir.cn/article/id/3403109e-d8d7-45ed-904f-eb4bc246275a
    [3]
    CHEN R, XIE Y, LUO X, et al. Joint-attention discriminator for accurate super-resolution via adversarial training[C]//Proceedings of the 27th ACM International Conference on Multimedia, 2019: 711-719.
    [4]
    LIU N, HAN J, YANG M-H. PiCANet: pixel-wise contextual attention learning for accurate saliency detection[J]. IEEE Transactions on Image Processing, 2020, 29: 6438-6451. DOI: 10.1109/TIP.2020.2988568
    [5]
    CHEN J, WAN L, ZHU J, et al. Multi-scale spatial and channel-wise attention for improving object detection in remote sensing imagery[J]. IEEE Geoscience and Remote Sensing Letters, 2019, 17(4): 681-685.
    [6]
    ZHOU B, Khosla A, Lapedriza A, et al. Learning deep features for discriminative localization[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016: 2921-2929.
    [7]
    Zagoruyko S, Komodakis N. Paying more attention to attention: improving the performance of convolutional neural networks via attention transfer[J/OL]. arXiv preprint arXiv:1612.03928, 2016. https://doi.org/10.48550/arXiv.1612.03928
    [8]
    Gulrajani I, Ahmed F, Arjovsky M, et al. Improved training of Wasserstein GANs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems, 2017: 5769-5779.
    [9]
    Toet A. The TNO multiband image data collection[J]. Data in Brief, 2017, 15: 249-251. DOI: 10.1016/j.dib.2017.09.038
    [10]
    MA J, YU W, LIANG P, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26. DOI: 10.1016/j.inffus.2018.09.004
    [11]
    Burt P, Adelson E. The Laplacian pyramid as a compact image code[J]. IEEE Transactions on Communications, 1983, 31(4): 532-540. DOI: 10.1109/TCOM.1983.1095851
    [12]
    Toet A. Image fusion by a ratio of low-pass pyramid[J]. Pattern Recognition Letters, 1989, 9(4): 245-253. DOI: 10.1016/0167-8655(89)90003-2
    [13]
    Lewis J J, O'Callaghan R J, Nikolov S G, et al. Pixel-and region-based image fusion with complex wavelets[J]. Information Fusion, 2007, 8(2): 119-130. DOI: 10.1016/j.inffus.2005.09.006
    [14]
    Chipman L J, Orr T M, Graham L N. Wavelets and Image Fusion[C]// International Conference on Image Processing of IEEE, 1995: 248-251.
    [15]
    Nencini F, Garzelli A, Baronti S, et al. Remote sensing image fusion using the curvelet transform[J]. Information Fusion, 2007, 8(2): 143-156. DOI: 10.1016/j.inffus.2006.02.001
    [16]
    Adu J, GAN J, WANG Y, et al. Image fusion based on nonsubsampled contourlet transform for infrared and visible light image[J]. Infrared Physics & Technology, 2013, 61: 94-100.
    [17]
    Roberts J W, Van Aardt J A, Ahmed F B. Assessment of image fusion procedures using entropy, image quality, and multispectral classification[J]. Journal of Applied Remote Sensing, 2008, 2(1): 023522. DOI: 10.1117/1.2945910
    [18]
    SHI W, ZHU C, TIAN Y, et al. Wavelet-based image fusion and quality assessment[J]. International Journal of Applied Earth Observation and Geoinformation, 2005, 6(3-4): 241-251. DOI: 10.1016/j.jag.2004.10.010
    [19]
    QU G, ZHANG D, YAN P. Information measure for performance of image fusion[J]. Electronics Letters, 2002, 38(7): 313-315.
    [20]
    HE L I, LEI L, CHAO Y, et al. An improved fusion algorithm for infrared and visible images based on multi-scale transform[J]. Semiconductor Optoelectronics, 2016, 74: 28-37.
    [21]
    MA J, YU W, LIANG P, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26.
  • Cited by

    Periodical cited type(1)

    1. QIU Xiangbiao, YANG Xiaoming, SUN Jianning, WANG Jian, CONG Xiaoqing, JIN Ge, ZENG Jinneng, ZHANG Zhengjun, PAN Kai, CHEN Xiaoqian. Current status and development of high-spatial-resolution microchannel plates. Infrared Technology. 2024(04): 460-466.

    Other cited types(0)
