Panchromatic and Multispectral Image Fusion Method Based on Detail Information Extraction
Abstract: The fusion of panchromatic (Pan) and multispectral (MS) images aims to generate multispectral images with high spatial resolution. To further improve the quality of the fused image, a fusion method based on detail information extraction is proposed. First, the high-frequency components of the Pan and MS images are obtained using a rolling guidance filter and a difference operation, respectively. Second, the adaptive intensity-hue-saturation (AIHS) transform is applied to the high-frequency component of the MS image and the high-frequency component of the Pan image after pixel-saliency detection, producing the corresponding intensity component (I); the detail image is then obtained as the difference between the Pan image and I. Next, a guided filter is used to compute the difference between the high-frequency components of the Pan and MS images, yielding the residual image. Finally, the detail and residual images are injected into the original MS image via the steepest descent method to obtain the final fusion result. Experimental results show that the fused images produced by the proposed algorithm achieve better subjective visual quality and superior scores on the objective quantitative evaluation metrics.
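To make the processing chain above concrete, here is a minimal Python/NumPy sketch of the same idea. It is an illustration under simplifying assumptions, not the authors' implementation: the rolling guidance filter is approximated by iterated Gaussian smoothing, the pixel-saliency weighting, the guided-filter residual, and the steepest-descent injection are omitted in favor of direct detail injection, and the function and parameter names (rolling_guidance_smooth, aihs_intensity, pansharpen_detail_injection, sigma_s, iters) are hypothetical. It also assumes the MS image is co-registered with the Pan image and that the resolution ratio is an integer.

```python
# Simplified, illustrative sketch of the detail-extraction fusion idea.
# NOT the authors' implementation: the rolling guidance filter is replaced by
# iterated Gaussian smoothing, and saliency weighting, the guided-filter
# residual, and steepest-descent injection are omitted. All names are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def rolling_guidance_smooth(img, sigma_s=3.0, iters=4):
    """Crude stand-in for a rolling guidance filter: repeated Gaussian
    smoothing removes small-scale structures from a 2-D image."""
    out = gaussian_filter(img, sigma_s)
    for _ in range(iters - 1):
        out = gaussian_filter(out, sigma_s)
    return out

def aihs_intensity(ms_bands, pan):
    """AIHS-style intensity: least-squares band weights so that the weighted
    sum of MS bands best matches the Pan image."""
    a = np.stack([b.ravel() for b in ms_bands], axis=1)      # (pixels, bands)
    w, *_ = np.linalg.lstsq(a, pan.ravel(), rcond=None)
    w = np.clip(w, 0.0, None)
    w /= max(w.sum(), 1e-12)
    return np.tensordot(w, np.stack(ms_bands), axes=([0], [0]))

def pansharpen_detail_injection(pan, ms):
    """pan: (H, W) array; ms: (H/r, W/r, B) array with integer ratio r."""
    pan = pan.astype(float)
    r = pan.shape[0] // ms.shape[0]
    ms_up = zoom(ms.astype(float), (r, r, 1), order=1)       # upsample MS to Pan grid
    # High-frequency components: image minus its smoothed version.
    pan_hf = pan - rolling_guidance_smooth(pan)
    ms_hf = ms_up - np.stack(
        [rolling_guidance_smooth(ms_up[..., b]) for b in range(ms_up.shape[-1])],
        axis=-1)
    # Intensity of the MS high frequencies, matched to the Pan high frequencies.
    i_hf = aihs_intensity([ms_hf[..., b] for b in range(ms_hf.shape[-1])], pan_hf)
    detail = pan_hf - i_hf                                    # detail image (Pan - I)
    # Simplified injection: add the same detail to every band.
    return ms_up + detail[..., None]
```

Calling pansharpen_detail_injection(pan, ms) with pan of shape (H, W) and ms of shape (H/r, W/r, B) returns an (H, W, B) result; the least-squares step is the AIHS-style part that adapts the intensity weights to the scene content.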
Table 1 Quantitative evaluation results of WV3-Ⅰ
Methods    CC      SSIM    RASE      ERGAS    SAM     UIQI    SCC     RMSE      PSNR
Reference  1       1       0         0        0       1       1       0         +∞
BT         0.8520  0.6323  27.5009   6.9281   0.0363  0.8451  0.6267  23.9945   20.5286
IHS        0.8509  0.5093  48.6246   21.4512  0.1710  0.6957  0.6160  42.4249   15.5784
CBD        0.8801  0.6437  27.7469   7.0537   0.0523  0.8746  0.6261  24.2091   20.4512
SFIM       0.8987  0.6397  23.6789   6.0072   0.0388  0.8986  0.6121  20.6598   21.8283
BDSD       0.8907  0.6614  24.7900   6.2571   0.0466  0.8904  0.6420  21.6292   21.4300
DTCWT      0.8860  0.6449  24.2148   6.1894   0.0457  0.8825  0.6352  21.1273   21.6339
NSCT-SR    0.8560  0.6278  26.9770   6.8812   0.0527  0.8491  0.6258  23.5374   20.6956
MTF-GLP    0.8941  0.6419  25.4094   6.5111   0.0513  0.8910  0.6486  22.1696   21.2156
Proposed   0.9076  0.6566  22.5488   5.7804   0.0493  0.9073  0.6548  19.6737   22.2531

Table 2 Quantitative evaluation results of WV3-Ⅱ
Methods    CC      SSIM    RASE      ERGAS    SAM     UIQI    SCC     RMSE      PSNR
Reference  1       1       0         0        0       1       1       0         +∞
BT         0.9730  0.8728  20.9863   5.2562   0.0627  0.9687  0.9397  14.5573   24.8692
IHS        0.9695  0.5631  63.2636   30.8796  0.2360  0.6860  0.9360  43.8831   15.2849
CBD        0.9763  0.8891  21.6071   5.4043   0.0849  0.9727  0.9400  14.9879   24.6160
SFIM       0.9814  0.8905  17.1809   4.3441   0.0635  0.9812  0.9282  11.9176   26.6070
BDSD       0.9758  0.8791  20.6105   5.182    0.0899  0.9745  0.9413  14.2966   25.0262
DTCWT      0.9815  0.8912  17.1121   4.2963   0.0677  0.9800  0.9406  11.8699   26.6419
NSCT-SR    0.9735  0.8720  20.7693   5.2093   0.0717  0.9697  0.9367  14.4067   24.9595
MTF-GLP    0.9816  0.9014  17.8018   4.4780   0.0776  0.9802  0.9383  12.3483   26.2987
Proposed   0.9838  0.9017  16.2089   4.0441   0.0841  0.9830  0.9368  11.2434   27.1129

Table 3 Quantitative evaluation results of GeoEye1
Methods    CC      SSIM    RASE      ERGAS    SAM     UIQI    SCC     RMSE      PSNR
Reference  1       1       0         0        0       1       1       0         +∞
BT         0.7706  0.9285  9.7191    2.3714   0.0174  0.7663  0.9016  5.3156    33.6198
IHS        0.7049  0.7516  47.4709   26.5205  0.1517  0.4170  0.9170  25.9627   19.8438
CBD        0.9338  0.9128  7.1367    2.0122   0.0243  0.9220  0.8847  3.9032    36.3023
SFIM       0.9677  0.9575  4.0797    1.0938   0.0174  0.9666  0.9247  2.2312    41.1598
BDSD       0.9492  0.9631  5.3800    1.5113   0.0205  0.9479  0.9453  2.9424    38.7567
DTCWT      0.9207  0.9479  5.9575    1.5881   0.0184  0.9189  0.9246  3.2583    37.8711
NSCT-SR    0.8088  0.9349  9.0938    2.4113   0.0228  0.8041  0.9158  4.9736    34.1974
MTF-GLP    0.9750  0.9657  3.8028    1.0340   0.0155  0.9742  0.9432  2.0798    41.7704
Proposed   0.9783  0.9686  3.4253    0.9681   0.0155  0.9782  0.9466  1.8734    42.6783

Table 4 Quantitative evaluation results of WV2
Methods    CC      SSIM    RASE      ERGAS    SAM     UIQI    SCC     RMSE      PSNR
Reference  1       1       0         0        0       1       1       0         +∞
BT         0.8979  0.6623  18.2241   4.5660   0.0564  0.8822  0.8795  61.312    12.3799
IHS        0.8978  0.5226  47.6344   24.0439  0.1574  0.6827  0.8810  160.2582  4.0344
CBD        0.9499  0.7216  13.8956   3.5849   0.0490  0.9477  0.9047  46.7494   14.7353
SFIM       0.9502  0.7207  13.2566   3.3281   0.0564  0.9479  0.8930  44.5996   15.1442
BDSD       0.9487  0.7267  13.6076   3.5028   0.0495  0.9487  0.9162  45.7805   14.9172
DTCWT      0.9392  0.6876  14.8766   3.8229   0.0573  0.9307  0.8856  50.0500   14.1427
NSCT-SR    0.9033  0.6649  18.0106   4.6550   0.0616  0.8883  0.8800  60.5936   12.4823
MTF-GLP    0.9489  0.7100  14.084    3.6470   0.0578  0.9440  0.8901  47.3833   14.6183
Proposed   0.9580  0.7395  12.2714   3.1650   0.0460  0.9557  0.9123  41.2852   15.8149
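The columns of Tables 1-4 are standard full-reference quality metrics computed against the reference image (the ideal values are given in the Reference row). The NumPy sketch below implements common textbook definitions of a few of them (RMSE, PSNR with a peak value of 255, CC, SAM in radians, and ERGAS); the exact variants and window sizes used for the remaining indices (SSIM, UIQI, SCC, RASE) and any normalization details in the paper may differ, so treat it as a guide rather than the evaluation code used here.

```python
# Common full-reference metrics used in pan-sharpening evaluation.
# Textbook formulas; the paper's exact variants may differ slightly.
import numpy as np

def rmse(ref, fused):
    """Root-mean-square error between reference and fused images (0 is ideal)."""
    d = ref.astype(float) - fused.astype(float)
    return float(np.sqrt(np.mean(d ** 2)))

def psnr(ref, fused, peak=255.0):
    """Peak signal-to-noise ratio in dB; +inf for a perfect match."""
    e = rmse(ref, fused)
    return float("inf") if e == 0 else 20.0 * np.log10(peak / e)

def cc(ref, fused):
    """Correlation coefficient (1 is ideal)."""
    return float(np.corrcoef(ref.ravel().astype(float),
                             fused.ravel().astype(float))[0, 1])

def sam(ref, fused, eps=1e-12):
    """Mean spectral angle in radians (0 is ideal); inputs are (H, W, B)."""
    r = ref.reshape(-1, ref.shape[-1]).astype(float)
    f = fused.reshape(-1, fused.shape[-1]).astype(float)
    cos = np.sum(r * f, axis=1) / (
        np.linalg.norm(r, axis=1) * np.linalg.norm(f, axis=1) + eps)
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))

def ergas(ref, fused, ratio=4):
    """ERGAS global error (0 is ideal); ratio = Pan/MS spatial resolution ratio."""
    bands = ref.shape[-1]
    acc = sum((rmse(ref[..., b], fused[..., b]) /
               (float(np.mean(ref[..., b])) + 1e-12)) ** 2 for b in range(bands))
    return 100.0 / ratio * float(np.sqrt(acc / bands))
```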