Review of Polarization Image Fusion Based on Deep Learning
Abstract: Polarization image fusion improves overall image quality by combining spectral and polarization information. It is used in different fields, such as image enhancement, space remote sensing, target identification, and military defense. In this study, building on a review of traditional fusion methods based on multi-scale transform, sparse representation, and pseudo-color mapping, we focus on the current state of research on polarization image fusion methods based on deep learning. First, research progress on polarization image fusion based on convolutional neural networks and generative adversarial networks is presented. Next, related applications in target detection, semantic segmentation, image defogging, and three-dimensional reconstruction are described, and publicly available high-quality polarization image datasets are collated. Finally, an outlook on future research is given.
Keywords:
- polarization image
- image fusion
- deep learning
0. Introduction
In recent years, cooled infrared detectors have developed rapidly toward small size, light weight, low power consumption, and low cost, i.e., low SWaP-C (size, weight and power, cost). Mid-wave infrared detectors of this type with integrated miniature cryocoolers have been widely used abroad in space-constrained infrared systems such as thermal weapon sights, portable handheld thermal imagers, small unmanned aerial vehicles, unmanned ground vehicles, remotely operated sniper and weapon stations, and missile seekers [1]. Designing a small, lightweight, low-cost mid-wave infrared continuous zoom optical system composed of only four lenses for this class of detector, and thereby producing compact, low-power, low-cost infrared zoom thermal imagers, will find wide application on platforms such as handheld sighting devices, border surveillance systems, and small unmanned systems.
Most existing four-element infrared zoom optics are dual field-of-view systems. Reference [2] describes a four-element uncooled dual field-of-view system, Reference [3] a four-element cooled long-wave dual field-of-view system, and Reference [4] a four-element cooled mid-wave dual field-of-view system. To date, no cooled infrared optical system achieving continuous zoom with only four lenses has been reported.
From zoom theory, a typical mechanically compensated zoom system consists of four lens groups: a front fixed group, a variator, a compensator, and a rear fixed group. The variator is generally a negative lens, while the compensator may be either a positive or a negative group; the former case is called a positive-group-compensation system and the latter a negative-group-compensation system. To ease aberration correction, reduce the lens count, and lower lens cost, this design adopts a positive-group-compensation zoom system without a rear fixed group: the zoom section consists of a positive front fixed lens that converges light from the target, a negative variator lens, and a positive compensator lens. To reduce the aperture of the front fixed objective and achieve 100% cold-shield efficiency, the system uses re-imaging, with a single positive lens relaying the intermediate image onto the detector focal plane. To shorten the axial length, two flat mirrors fold the optical path into a U shape, finally realizing a lightweight, compact mid-wave infrared continuous zoom optical system composed of only four lenses.
1. Theoretical Model for Continuous Zoom Calculation
Solving the parameters of a mechanically compensated continuous zoom system means determining the focal lengths, separations, and displacements of each lens group such that the image plane remains stationary while the focal length varies continuously over a given range. A mathematical model makes it convenient to calculate and analyze the zoom process and to determine the Gaussian optical parameters of the system [5]. The motion of a two-component positive-group-compensation continuous zoom system is shown in Fig. 1.
Since only the moving groups of a zoom system produce image-plane displacement, only the variator and compensator need be analyzed.
A differential movement dq of the variator f2′ shifts the image plane of the moving groups by m32(1−m22)dq, and a differential movement dΔ of the compensator f3′ shifts it by (1−m32)dΔ. For the image plane to remain stationary, the algebraic sum of the two shifts must be zero:
$$ m_3^2\left(1-m_2^2\right) \mathrm{d} q+\left(1-m_3^2\right) \mathrm{d} \varDelta=0$$ (1) The differential displacements dq and dΔ of the variator f2′ and compensator f3′ are related to their magnification changes dm2 and dm3 by:
$$ \mathrm{d} q = \frac{{{f_2}'}}{{m_2^2}}\mathrm{d}{m_2} $$ (2) $$ \mathrm{d} \varDelta=f_3{ }^{\prime} \mathrm{d} m_3 $$ (3) Substituting (2) and (3) into (1) and rearranging yields the two-component continuous zoom differential equation:
$$ \frac{{1 - m_2^2}}{{m_2^2}}{f_2}'\mathrm{d}{m_2} + \frac{{1 - m_3^2}}{{m_3^2}}{f_3}'\mathrm{d}{m_3} = 0 $$ (4) Equation (4) is an exact (total) differential equation in two variables; taking U(m2, m3) as its potential function, dU(m2, m3)=0.
Its general solution is:
$$ U({m_2}, {m_3}) = {f_2}'(\frac{1}{{{m_2}}} + {m_2}) + {f_3}'(\frac{1}{{{m_3}}} + {m_3}) = C $$ (5) where C is a constant. Let the variator f2′ and compensator f3′ both start at the long-focus position of the system; then:
$$ m_{2}=m_{2l}, \quad m_{3}=m_{3l} $$ (6) Likewise:
$$ {f_2}'(\frac{1}{{{m_{2l}}}} + {m_{2l}}) + {f_3}'(\frac{1}{{{m_{3l}}}} + {m_{3l}}) = C $$ (7) Eliminating the constant C gives the particular solution:
$$ {f_2}'(\frac{1}{{{m_2}}} - \frac{1}{{{m_{2l}}}} + {m_2} - {m_{2l}}) + {f_3}'(\frac{1}{{{m_3}}} - \frac{1}{{{m_{3l}}}} + {m_3} - {m_{3l}}) = 0 $$ (8) Rearranging (8) gives a quadratic equation in the compensator magnification m3:
$$ m_3^2 - b{m_3} + 1 = 0 $$ (9) where
$$ b = - \frac{{{f_2}'}}{{{f_3}'}}(\frac{1}{{{m_2}}} - \frac{1}{{{m_{2l}}}} + {m_2} - {m_{2l}}) + (\frac{1}{{{m_{3l}}}} + {m_{3l}}) $$ (10) whose two roots are:
$$ {m_{31}} = \frac{{b + \sqrt {{b^2} - 4} }}{2} $$ (11) $$ {m_{32}} = \frac{{b - \sqrt {{b^2} - 4} }}{2} $$ (12) The system parameters are then solved as follows:
1) Integrate (2) and rearrange to obtain the variator magnification m2:
$$ {m_2} = \frac{1}{{\frac{1}{{{m_{2l}}}} + \frac{{{q_2}}}{{{f_2}'}}}} $$ (13) 2) From this m2, compute the coefficient b from (10), then obtain from (11) and (12) the two solutions m31 and m32 of the compensator motion equation.
3) From the two compensator solutions, compute the displacements Δ1 and Δ2 that compensate the image-plane shift:
$$ \varDelta_{1}=f_{3}'(m_{31}-m_{3l})$$ (14) $$ \varDelta_{2}=f_{3}'(m_{32}-m_{3l}) $$ (15) 4) Compute the total zoom ratio of the system:
$$ {\varGamma _1} = \frac{{{m_{2l}}{m_{3l}}}}{{{m_2}{m_{31}}}} $$ (16) $$ {\varGamma _2} = \frac{{{m_{2l}}{m_{3l}}}}{{{m_2}{m_{32}}}} $$ (17) Each variator displacement q2 corresponds to a pair Γ1 and Γ2; the variator and compensator move synchronously until the required total zoom ratio is reached, the final state being the short-focus position of the system. With the differential zoom model above, equations (10)–(17) yield the optical power distribution and separations of all groups.
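The four solution steps above can be sketched numerically. A minimal Python sketch, using equations (10)–(17) exactly as written; the normalized starting values of Section 3.2 (f2′ = −1, f3′ = 1.62, m2l = −1.45, m3l = −1.34) are used below only as illustrative test inputs:

```python
import math

def zoom_solve(f2p, f3p, m2l, m3l, q2):
    """Solve the two-component positive-group-compensation zoom
    equations (10)-(17) for one variator displacement q2.
    Returns (m2, (m31, m32), (d1, d2), (G1, G2))."""
    # Eq. (13): variator magnification after moving by q2
    m2 = 1.0 / (1.0 / m2l + q2 / f2p)
    # Eq. (10): coefficient of the quadratic in m3
    b = -f2p / f3p * (1.0 / m2 - 1.0 / m2l + m2 - m2l) + (1.0 / m3l + m3l)
    # Eqs. (11)-(12): the two compensator magnifications
    s = math.sqrt(b * b - 4.0)
    m31, m32 = (b + s) / 2.0, (b - s) / 2.0
    # Eqs. (14)-(15): compensator displacements from the long-focus state
    d1, d2 = f3p * (m31 - m3l), f3p * (m32 - m3l)
    # Eqs. (16)-(17): total zoom ratios
    G1, G2 = (m2l * m3l) / (m2 * m31), (m2l * m3l) / (m2 * m32)
    return m2, (m31, m32), (d1, d2), (G1, G2)
```

At q2 = 0 the solver reproduces the long-focus starting state: one compensator root equals m3l, its displacement is zero, and the corresponding zoom ratio Γ is 1.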
2. Design Flow of the Continuous Zoom System
For a complex zoom optical system, building the model is the crucial step. Although the Gaussian parameters of a zoom system can be derived from equations (10)–(17), a poor choice of initial parameters may give an unreasonable power distribution that makes aberration correction difficult, or unsuitable separations that cause the moving groups to collide. Given the complexity of continuous zoom systems and the difficulty of modeling them, a parameter-calculation program based on the continuous zoom theoretical model was written to assist in establishing the ideal optical model. After modeling, the design effort focuses on lens-type selection and on setting and dynamically modifying the merit function, so that the design converges quickly [6]. The design flow of the continuous zoom optical system is shown in Fig. 2.
First, the ideal optical model is established according to the design specifications and the parameters of each component are determined. Then lens types and materials are chosen, the merit function is set, and local and global optimization are performed. Finally, image quality is evaluated against the design results. The merit-function optimization and image-quality evaluation steps are iterated repeatedly until the design specifications are met.
3. Design of the Continuous Zoom Optical System
3.1 Design Specifications
The cooled mid-wave continuous zoom optical system uses a miniaturized cooled mid-wave infrared 640×512 focal-plane detector assembly, only 119 mm long, developed and produced by Kunming Institute of Physics. The detector parameters are listed in Table 1, and the main design specifications of the zoom system in Table 2.
Table 1 Parameters of detector

Detector material: HgCdTe
Array size: 640×512
Pixel dimension: 15 μm
NETD: ≤22 mK
Spectral response: 3.7–4.8 μm
Weight: ≤380 g

Table 2 Parameters of optical system design

Working waveband: 3.7–4.8 μm
Zoom ratio: 10:1
Field of view: 20°×16° to 2.0°×1.6°
F#: 4
Focal length: 27.0–275 mm
Working temperature: −40℃ to +60℃

3.2 Design Process
First, following the system specifications, a positive-group-compensation parameter-calculation program based on the continuous zoom theoretical model was written to solve the optical power distribution and separations of the zoom groups. The initial parameters were chosen with the following considerations:
1) The system uses re-imaging, which both reduces the diameter of the front fixed group and achieves 100% cold-shield efficiency. The initial magnification of the relay group is set to −1×, and the front zoom kernel is assigned values directly from the actual zoom range of the system, so no further scaling is needed;
2) The system has no rear fixed group; in the theoretical solution the rear-fixed-group magnification is set to 1×, i.e., it is treated as a virtual surface with zero optical power;
3) To reduce the travel of the variator and compensator during zooming, the variator's initial magnification at the long-focus position is given a relatively large value, in accordance with fastest-zoom theory;
4) Because the U-shaped fold requires ample space between the compensator and the relay group for the two flat mirrors, the compensator is given a relatively long focal length;
5) At the short-focus position, sufficient separation must remain between the front fixed group and the variator so that the two lenses do not collide; the initial value is set to 0.55, and the distance from the compensator to the zero-power virtual rear-fixed-group surface is likewise initialized to 0.55.
Using the calculation program and repeatedly adjusting the initial parameters while checking that the group separations and power distribution were reasonable, the initial system values were finally determined as:
$$ \begin{gathered} f_{2}'=-1,\; f_{3}'=1.62,\; m_{2l}=-1.45,\\ m_{3l}=-1.34,\; d_{12d}=0.55,\; d_{34d}=0.55 \end{gathered} $$ Table 3 lists the zoom-separation parameters at five field positions calculated by the simple zoom-parameter program from these initial values.
Table 3 Initial spacing parameters of optical system

Focal length/mm: 275 / 215 / 150 / 78.6 / 25.9
f1/f2 spacing/mm: 83.49 / 80.06 / 74.56 / 59.44 / 18.90
f2/f3 spacing/mm: 13.02 / 24.01 / 38.57 / 68.69 / 125.94
f3/f4 spacing/mm: 67.23 / 59.67 / 50.61 / 35.61 / 18.90

Next, the group optical powers and separations calculated by the program were entered into the ZEMAX optical design software to obtain the paraxial structure of the system, shown in Fig. 3. Analysis of the paraxial structure shows that the focal length at each position agrees with the calculated value, the total length is constant, the separations are reasonable, and the zoom curves are continuous, confirming the program results; the design can then proceed to lens-type selection.
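A quick consistency check on the Table 3 data: since the front fixed group and the relay group do not move during zoom, the three spacings must sum to the same overall length at every zoom position. A minimal sketch:

```python
# Spacings from Table 3 (mm), columns ordered from the 275 mm
# long-focus position to the 25.9 mm short-focus position.
f1_f2 = [83.49, 80.06, 74.56, 59.44, 18.90]
f2_f3 = [13.02, 24.01, 38.57, 68.69, 125.94]
f3_f4 = [67.23, 59.67, 50.61, 35.61, 18.90]

# With fixed front and relay groups, the sum of the three spacings
# must be identical at every zoom position (constant total length).
totals = [a + b + c for a, b, c in zip(f1_f2, f2_f3, f3_f4)]
```

Every column sums to 163.74 mm, confirming the "total length is constant" statement.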
Then the materials of each group and the focusing lens were considered. Since the system must image clearly over a −40℃ to +60℃ operating range, athermal design theory requires the power distribution, material selection, and element separations to satisfy three equations: total optical power, achromatism, and athermalization [7]:
$$ \sum\limits_{i = 1}^j {{h_i}{\varphi _i}} = \varphi $$ (18) $$ {\left( {\frac{1}{{{h_1}\varphi }}} \right)^2}\sum\limits_{i = 1}^j {h_i^2{\varphi _i}{\theta _i}} = 0 $$ (19) $$ {\left( {\frac{1}{{{h_1}\varphi }}} \right)^2}\sum\limits_{i = 1}^j {h_i^2{\varphi _i}{\chi _i}} = \sum\limits_{i = 1}^j {{\alpha _i}{L_i}} $$ (20) where hi, φi, θi, and χi are the paraxial ray height, optical power, chromatic coefficient, and thermal coefficient of each lens group; h1 is the paraxial ray height at the first lens; φ is the total power of the system; αi is the linear expansion coefficient of the spacer (barrel) material between lenses; and Li is the length of each spacer.
The system achieves athermalized continuous zoom through a stepwise design technique. First, powers and materials satisfying (18) and (19) are chosen so that the system images clearly through zoom at room temperature and the defocus varies linearly at temperature extremes; the optical materials are mainly single-crystal silicon, single-crystal germanium, and zinc selenide. Second, active compensation is used to satisfy (20). Because the system has few lenses and the variator and compensator run on cam tracks, axial motion of the relay group would change the magnification, i.e., the field of view and focal length of the system; therefore the front fixed group is moved axially for active focusing and thermal compensation. With this athermalization approach, the optical system maintains essentially unchanged performance over the −40℃ to +60℃ temperature range.
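As an illustration of how (18) and (19) jointly constrain the power split, consider the textbook thin-lens contact-doublet special case (equal ray heights on both elements). The Abbe numbers below are hypothetical mid-wave values chosen only for the test; the actual design values are not listed in the paper:

```python
def achromat_split(phi, v1, v2):
    """Thin-lens contact-doublet power split satisfying the power
    equation (18) and the achromatic equation (19) with h1 = h2,
    where the chromatic coefficient of each element is 1/v:
        phi1 + phi2 = phi,   phi1/v1 + phi2/v2 = 0."""
    phi1 = phi * v1 / (v1 - v2)
    phi2 = -phi * v2 / (v1 - v2)
    return phi1, phi2
```

For a positive total power the split yields a strong positive element in the low-dispersion material and a weaker negative element in the high-dispersion one, which is why the high-dispersion germanium lens naturally serves as the negative variator.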
Finally, multiple configurations were set up, the aberration merit function was carefully designed, and the system was iteratively optimized. To sharpen imaging in all fields through zoom, several high-order aspheric surfaces and binary diffractive surfaces were introduced, providing additional degrees of freedom that help correct spherical aberration, chromatic aberration, astigmatism, and other aberrations.
3.3 Design Results
The final design of the lightweight, compact mid-wave infrared continuous zoom optical system is shown in Fig. 4; from top to bottom are the layouts at the 275 mm long-focus, 100 mm mid-focus, and 27 mm short-focus positions.
The front fixed group, made of single-crystal silicon, converges light from the target scene and reduces the size of the variator lens. The variator is single-crystal germanium, whose high refractive index and high dispersion enable a large zoom ratio. The compensator is zinc selenide, chosen mainly for its low thermo-optic coefficient, so that it produces only a small defocus at temperature extremes that can be compensated in real time. The relay group uses single-crystal silicon, with its low thermo-optic coefficient and low dispersion, to balance the residual chromatic aberration of the front zoom kernel.
The complete optical system consists of four lenses and two flat mirrors. The largest lens is the first, with a machined diameter of 71 mm. After the U-shaped fold by the flat mirrors, the system measures 172 mm axially and 108 mm laterally, and the optical elements weigh 64 g in total. With few lenses, low weight, a compact folded path, a small volume, and 100% cold-shield efficiency, the system matches the miniaturized cooled detector and embodies the lightweight, compact design philosophy of the continuous zoom optical system.
3.4 Analysis of the Binary Diffractive Surfaces
Because binary diffractive surfaces have excellent athermalizing and achromatizing properties, two of them are used to reduce the lens count, simplify the system design, and improve image quality.
The binary surface on the zinc selenide compensator lens has parameters Norm Radius=15 mm, A1=−33.416, A2=4.580. The calculated zone depth versus lens radius is shown in Fig. 5. The ZnSe binary surface has 4 zones, a maximum zone depth of 2.918 μm, and a minimum zone spacing of 2.06 mm. With wide zones and few rings, this surface is easy to fabricate by single-point diamond turning.
The binary surface on the single-crystal silicon relay lens has parameters Norm Radius=16 mm, A1=−60.402, A2=−1.254. Its zone depth versus radius is shown in Fig. 6. The silicon binary surface has 9 zones, a maximum zone depth of 1.72 μm, and a minimum zone spacing of 0.86 mm. This lens is comparatively hard to fabricate because the material is hard and the rings numerous; the optics center of Kunming Institute of Physics can currently produce silicon-substrate binary optical elements meeting the specifications by single-point diamond turning.
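The quoted zone counts and minimum zone spacings can be reproduced from the phase polynomial. The sketch below assumes the ZEMAX-style binary-surface convention, phase φ(ρ) = A1ρ² + A2ρ⁴ in radians with ρ = r/Norm Radius, each 2π crossing being one ring boundary:

```python
import math

def zone_radii(a1, a2, r_norm):
    """Radii (same units as r_norm) where the binary-surface phase
    phi(rho) = a1*rho^2 + a2*rho^4 crosses successive multiples of
    2*pi; each crossing is one diffractive ring boundary. Assumes
    the phase is monotonic over the aperture, which holds for both
    surfaces analyzed here."""
    total = abs(a1 + a2)                    # |phase| at the aperture edge
    n_zones = int(total / (2.0 * math.pi))  # number of full 2*pi transitions
    radii = []
    for k in range(1, n_zones + 1):
        lo, hi = 0.0, 1.0                   # bisect |phi(rho)| = 2*pi*k
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if abs(a1 * mid**2 + a2 * mid**4) < 2.0 * math.pi * k:
                lo = mid
            else:
                hi = mid
        radii.append(r_norm * 0.5 * (lo + hi))
    return radii
```

With the parameters above it returns 4 ring boundaries for the ZnSe surface (minimum spacing ≈ 2.07 mm) and 9 for the silicon surface (minimum spacing ≈ 0.86 mm), consistent with the quoted values.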
3.5 Cam Curve Calculation
Applying dynamic optics theory [8], the compensator motion curve is computed from the image-shift compensation formula. Since the variator and compensator both translate one-dimensionally along the optical axis, the image-stabilization equation is:
$$ {\beta _{2m}}{\beta _2}(1 - {\beta _{1m}}{\beta _1}){q_1} + (1 - {\beta _{2m}}{\beta _2}){q_2} = 0 $$ (21) 式中:β1为变倍组初始位置的垂轴放大率;β1m为变倍组运动后的垂轴放大率;β2为补偿组初始位置的垂轴放大率;β2m补偿组运动后的垂轴放大率;q1为变倍组沿光轴位移量;q2为补偿组沿光轴位移量。
with:
$$ {\beta _{1m}} = \frac{{{\beta _1}{f_1}'}}{{{f_1}' - {\beta _1}{q_1}}} $$ (22) $$ {\beta _{2m}} = \frac{{{\beta _2}{f_2}'}}{{{f_2}' + (1 - {\beta _1}{\beta _{1m}}){\beta _2}{q_2} - {\beta _2}{q_2}}} $$ (23) From (21)–(23), the relation between q1 and q2 follows:
$$ Aq_{2}^{2}+Bq_{2}+C=0$$ (24) where:
$$ A=(f_{1}'-\beta_{1}q_{1})\beta_{2} $$ $$ B=\beta_{1}\beta_{2}q_{1}^{2}+[f_{2}'(1-\beta_{2}^{2})\beta_{1}-f_{1}'(1-\beta_{1}^{2})\beta_{2}]q_{1}-f_{1}'f_{2}'(1-\beta_{1}^{2}) $$ $$ C=\beta_{2}^{2}f_{2}'[\beta_{1}q_{1}-f_{1}'(1-\beta_{1}^{2})]q_{1} $$ which gives:
$$ {q_2} = \frac{{ - B \pm \sqrt {{B^2} - 4AC} }}{{2A}} $$ (25) The cam curves of the zoom system computed from these formulas are shown in Fig. 7. The variator's maximum travel is 28.6 mm and the compensator's is 35 mm; the compensator curve varies smoothly, which facilitates cam-track machining.
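Equations (24)–(25) can be solved directly for each entry of the cam table. A minimal sketch using the coefficients of this section verbatim; the numeric parameters in the test are hypothetical normalized values, since the actual β and f′ values of the design are not listed:

```python
import math

def compensator_travel(f1p, f2p, b1, b2, q1):
    """Solve the stabilization quadratic (24) A*q2^2 + B*q2 + C = 0
    for the compensator displacement q2 given a variator displacement
    q1 (Section 3.5 numbering: f1p, b1 are the variator focal length
    and magnification; f2p, b2 those of the compensator). Returns
    both roots; the mechanically valid branch is chosen by the cam
    designer."""
    A = (f1p - b1 * q1) * b2
    B = (b1 * b2 * q1**2
         + (f2p * (1 - b2**2) * b1 - f1p * (1 - b1**2) * b2) * q1
         - f1p * f2p * (1 - b1**2))
    C = b2**2 * f2p * (b1 * q1 - f1p * (1 - b1**2)) * q1
    s = math.sqrt(B * B - 4 * A * C)
    return ((-B + s) / (2 * A), (-B - s) / (2 * A))
```

At q1 = 0 the coefficient C vanishes, so one root is q2 = 0: when the variator has not moved, no compensator motion is needed, which is a quick sanity check on the coefficient expressions.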
4. Image Quality Evaluation
4.1 Optical Transfer Function
The modulation transfer function (MTF) of the system at the three focal-length states is shown in Fig. 8. The MTF at all three states meets the operational requirements, and the optical system images sharply.
4.2 Spot Diagram
The spot diagrams at the three focal-length states are shown in Fig. 9. The RMS (root mean square) spot radius in every field is smaller than one pixel: the maximum RMS radius is 12.6 μm, below the pixel pitch, and the maximum geometric radius is 24.5 μm, comparable to the 20.9 μm Airy radius of the system. Image quality is good and meets the operational requirements.
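As a cross-check, the quoted Airy radius follows from the standard diffraction formula, assuming a mid-band reference wavelength near 4.3 μm (the exact design wavelength is not stated):

$$ r_{\text{Airy}} = 1.22\,\lambda\,F\# \approx 1.22 \times 4.3\ \text{μm} \times 4 \approx 21\ \text{μm} $$

consistent with the 20.9 μm value above.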
4.3 Distortion
The distortion of the system is shown in Fig. 10. At the long-focus, narrow-field position the maximum distortion is 1.2%; at the short-focus, wide-field position it is 2.4%. This level of distortion has no noticeable effect on continuous imaging.
4.4 Image Quality at Temperature Extremes
In hot and cold operating environments, the system refocuses athermally by moving the front fixed group axially. The long-focus end is the most affected by ambient temperature changes, so the analysis here concentrates on the compensated image quality at the 275 mm focal length at temperature extremes. Fig. 11 shows the compensated long-focus MTF, and Fig. 12 the compensated long-focus spot diagrams, at high and low temperature. The MTF curves and spot diagrams show that the system maintains good image quality over −40℃ to +60℃ and meets the operational requirements.
5. Conclusion
Based on a miniaturized cooled mid-wave 640×512 focal-plane detector with 15 μm pixel pitch, a positive-group-compensation continuous zoom optical system with SWaP-C characteristics has been designed. The system consists of four lenses and two flat mirrors, with F# of 4, a field of view ranging from 20°×16° to 2.0°×1.6°, and a 10× zoom ratio. After the U-shaped fold the system envelope is 172 mm×108 mm, the largest objective diameter is 71 mm, and the optical elements weigh 64 g in total; the fabrication processes are mature, the zoom cam curve is smooth, and good image quality is maintained over −40℃ to +60℃. This lightweight, compact mid-wave infrared continuous zoom optical system has broad market prospects in navigation, search, tracking, surveillance, and reconnaissance.
Table 1 Comparison of traditional polarization image fusion methods

Multi-scale transformation
- Specificities: Extracts important visual information at different scales, providing good spatial and frequency resolution.
- Advantages: Extracts multi-scale detail and structural information, effectively improving the quality of fused images.
- Shortcomings: The choice of decomposition levels and fusion rules usually depends on manual experience.

Sparse representation
- Specificities: Uses a linear subspace representation of training samples; suitable for approximating similar objects.
- Advantages: Captures sparse features and highlights target details, retaining information unique to each source image.
- Shortcomings: Dictionary training is computationally complex and relatively sensitive to noise and pseudo-features.

Pulse coupled neural network
- Specificities: Composed of neurons with reception, modulation, and pulse-generation parts; suitable for real-time image processing.
- Advantages: Effectively detects edge and texture features of the image; edge information fuses relatively well.
- Shortcomings: Requires many iterative computations; operations are highly coupled, parameters numerous, and runtime long.

Pseudo-color-based methods
- Specificities: Maps the gray levels of a black-and-white or monochrome image to a color space, or assigns corresponding colors to different gray levels.
- Advantages: Images of different bands can be mapped to a pseudo-color space, visually presenting multi-band information for easy observation and understanding.
- Shortcomings: Mainly colorizes the image; cannot extract and fuse additional information, and its ability to retain detail is relatively weak.
Table 2 Comparison of polarization image fusion algorithms based on CNN and GAN
CNN
- Specificities: Algorithm complexity depends on the coding method and the design of the fusion rules. A CNN-based fusion network has strong feature learning and representation ability, making it well suited to information extraction and feature fusion.
- Advantages: Automatically learns image features and patterns, simplifying algorithm design and implementation and greatly improving accuracy; widely used for feature extraction and representation.
- Shortcomings: May overfit when trained on small datasets; can be insensitive to image detail and lose it during fusion; deeper networks usually demand substantial computational resources and time.

GAN
- Specificities: Models fusion as an adversarial game between generator and discriminator; through continued optimization the generator's output converges to the target image in probability distribution. Feature extraction, fusion, and image reconstruction are realized implicitly.
- Advantages: The adversarial learning mechanism enhances the realism and overall quality of the fusion, better preserving the details and structural features of the source images; training is unsupervised and usually needs no large amount of labeled data.
- Shortcomings: Training is relatively unstable; design and tuning are complex, requiring careful choice of network architecture and loss function; may produce artifacts or unnatural results in some specific scenarios.
Table 3 Polarization image datasets
Reference [60]: visible band (grayscale), 2019, 120 images, 1280×960
Reference [61]: visible band (RGB), 2020, 40 images, 1024×768
Reference [62-63]: long-wave infrared band, 2020, 2113 images, 640×512
Reference [47]: visible band (RGB), 2021, 394 images, 1224×1024
Reference [64]: visible band (RGB), 2021, 66 images, 1848×2048
Reference [65]: visible band (RGB), 2021, 40 images, 1024×1024
[1] LI S, KANG X, FANG L, et al. Pixel-level image fusion: a survey of the state of the art[J]. Information Fusion, 2017, 33: 100-112. DOI: 10.1016/j.inffus.2016.05.004
[2] ZHANG H, XU H, TIAN X, et al. Image fusion meets deep learning: a survey and perspective[J]. Information Fusion, 2021, 76: 323-336. DOI: 10.1016/j.inffus.2021.06.008
[3] LUO Haibo, ZHANG Junchao, GAI Xingqin, et al. Development status and prospect of polarization imaging technology (Invited)[J]. Infrared and Laser Engineering, 2022, 51(1): 101-110.
[4] ZHOU Qiangguo, HUANG Zhiming, ZHOU Wei. Research progress and application of polarization imaging technology[J]. Infrared Technology, 2021, 43(9): 817-828. http://hwjs.nvir.cn/article/id/76230e4e-2d34-4b1e-be97-88c5023050c6
[5] DUAN Jin, FU Qiang, MO Chunhe, et al. Review of polarization imaging technology for international military application (Ⅰ)[J]. Infrared Technology, 2014, 36(3): 190-195. http://hwjs.nvir.cn/article/id/hwjs201403003
[6] MO Chunhe, DUAN Jin, FU Qiang, et al. Review of polarization imaging technology for international military application (Ⅱ)[J]. Infrared Technology, 2014, 36(4): 265-270. http://hwjs.nvir.cn/article/id/hwjs201404002
[7] WANG Xia, ZHAO Jiabi, SUN Jing, et al. Review of polarization image fusion technology[J]. Aerospace Return and Remote Sensing, 2021, 42(6): 9-21.
[8] LI X, YAN L, QI P, et al. Polarimetric imaging via deep learning: a review[J]. Remote Sensing, 2023, 15(6): 1540. DOI: 10.3390/rs15061540
[9] YANG Fengbao, DONG Anran, ZHANG Lei, et al. Infrared polarization image fusion based on combination of NSST and improved PCA[J]. Journal of Measurement Science and Instrumentation, 2016, 7(2): 176-184.
[10] YANG Fengbao, DONG Anran, ZHANG Lei, et al. Infrared polarization image fusion using the synergistic combination of DWT, NSCT and improved PCA[J]. Infrared Technology, 2017, 39(3): 201-208. http://hwjs.nvir.cn/article/id/hwjs201703001
[11] SHEN Xuechen, LIU Jun, GAO Ming. Polarization image fusion algorithm based on Wavelet-Contourlet transform[J]. Infrared Technology, 2020, 42(2): 182-189. http://hwjs.nvir.cn/article/id/hwjs202002013
[12] ZHANG Yuchen, LI Jiangyong. Polarization image fusion based on wavelet transform[J]. Laser & Infrared, 2020, 50(5): 578-582.
[13] WANG Ce, XU Suan. Underwater polarization image fusion method based on Retinex and wavelet transform[J]. Applied Laser, 2022, 42(8): 116-122.
[14] CHEN Jinni, CHEN Yuyang, LI Yunhong, et al. Fusion of infrared intensity and polarized images based on structure and decomposition[J]. Infrared Technology, 2023, 45(3): 257-265. http://hwjs.nvir.cn/article/id/379e87a8-b9c0-4081-820c-ccd63f3fe4f0
[15] LIU Y, LIU S, WANG Z. A general framework for image fusion based on multiscale transform and sparse representation[J]. Information Fusion, 2015, 24(C): 147-164.
[16] ZHU Pan, LIU Zeyang, HUANG Zhanhua. Infrared polarization and intensity image fusion based on dual-tree complex wavelet transform and sparse representation[J]. Acta Photonica Sinica, 2017, 46(12): 207-215.
[17] ZHU P, LIU L, ZHOU X. Infrared polarization and intensity image fusion based on bivariate BEMD and sparse representation[J]. Multimedia Tools and Applications, 2021, 80(3): 4455-4471. DOI: 10.1007/s11042-020-09860-z
[18] ZHANG S, YAN Y, SU L, et al. Polarization image fusion algorithm based on improved PCNN[C]//Proceedings of SPIE-The International Society for Optical Engineering, 2013, 9045.
[19] LI Shiwei, HUANG Danfei, WANG Huimin, et al. Polarization image fusion based on BEMD and adaptive PCNN[J]. Laser Journal, 2018, 39(3): 94-98.
[20] YU Jinqiang, DUAN Jin, CHEN Weimin, et al. Underwater polarization image fusion based on NSST and adaptive SPCNN[J]. Laser & Optoelectronics Progress, 2020, 57(6): 103-113.
[21] YE Song, TANG Weiping, SUN Xiaobing, et al. Characterization of the polarized remote sensing images using IHS color system[J]. Remote Sensing Information, 2006, 21(2): 11-13.
[22] ZHAO Yongqiang, PAN Quan, ZHANG Hongcai. Research on adaptive multi-band polarization image fusion[J]. Acta Photonica Sinica, 2007, 36(7): 1356-1359.
[23] ZHAO Yongqiang, PAN Quan, ZHANG Hongcai. A new spectral and panchromatic images fusion method[J]. Acta Photonica Sinica, 2007, 36(1): 180-183.
[24] ZHOU Pucheng, HAN Yusheng, XUE Menggen, et al. Polarization image fusion method based on non-negative matrix factorization and IHS color model[J]. Acta Photonica Sinica, 2010, 39(9): 1682-1687.
[25] ZHOU Pucheng, ZHANG Hongkun, XUE Mogen. Polarization image fusion method using color transfer and clustering-based segmentation[J]. Acta Photonica Sinica, 2011, 40(1): 149-153.
[26] LI Weiwei, YANG Fengbao, LIN Suzhen, et al. Study on pseudo-color fusion of infrared polarization and intensity image[J]. Infrared Technology, 2012, 34(2): 109-113. DOI: 10.3969/j.issn.1001-8891.2012.02.010
[27] SUN Jing. Research on Multi-band Polarization Image Fusion Method[D]. Beijing: Beijing Institute of Technology, 2019.
[28] SU Zihang. Research on Multi-band Polarization Image Information Correction and Enhancement Technology[D]. Beijing: Beijing Institute of Technology, 2021.
[29] HU J, MOU L, Schmitt A, et al. FusioNet: a two-stream convolutional neural network for urban scene classification using PolSAR and hyperspectral data[C]//Proceedings of the 2017 Joint Urban Remote Sensing Event (JURSE), 2017: 1-4.
[30] ZHANG J, SHAO J, CHEN J, et al. PFNet: an unsupervised deep network for polarization image fusion[J]. Optics Letters, 2020, 45(6): 1507-1510. DOI: 10.1364/OL.384189
[31] WANG S, MENG J, ZHOU Y, et al. Polarization image fusion algorithm using NSCT and CNN[J]. Journal of Russian Laser Research, 2021, 42(4): 443-452. DOI: 10.1007/s10946-021-09981-2
[32] ZHANG J, SHAO J, CHEN J, et al. Polarization image fusion with self-learned fusion strategy[J]. Pattern Recognition, 2021, 118(22): 108045.
[33] XU H, SUN Y, MEI X, et al. Attention-Guided polarization image fusion using salient information distribution[J]. IEEE Transactions on Computational Imaging, 2022, 8: 1117-1130. DOI: 10.1109/TCI.2022.3228633
[34] YAN Deli, SHEN Chong, WANG Chenguang, et al. Design of intensity image and polarization image fusion network[J]. Optics and Precision Engineering, 2023, 31(8): 1256-1266.
[35] Goodfellow I, Pouget-Abadie J, Mirza M, et al. Generative adversarial nets[C]//Advances in Neural Information Processing Systems, 2014: 2672-2680.
[36] MA J, YU W, LIANG P, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26. DOI: 10.1016/j.inffus.2018.09.004
[37] ZHAO C, WANG T, LEI B. Medical image fusion method based on dense block and deep convolutional generative adversarial network[J]. Neural Computing and Applications, 2021, 33: 6595-6610.
[38] LIU Q, ZHOU H, XU Q, et al. PSGAN: a generative adversarial network for remote sensing image pan-sharpening[J]. IEEE Transactions on Geoscience and Remote Sensing, 2021, 59(12): 10227-10242. DOI: 10.1109/TGRS.2020.3042974
[39] MA J, XU H, JIANG J, et al. DDcGAN: a dual-discriminator conditional generative adversarial network for multi-resolution image fusion[J]. IEEE Transactions on Image Processing, 2020, 29: 4980-4995. DOI: 10.1109/TIP.2020.2977573
[40] LI J, HUO H, LI C, et al. Attention FGAN: infrared and visible image fusion using attention-based generative adversarial networks[J]. IEEE Transactions on Multimedia, 2021, 23: 1383-1396. DOI: 10.1109/TMM.2020.2997127
[41] MA J, ZHANG H, SHAO Z, et al. GANMcC: a generative adversarial network with multi-classification constraints for infrared and visible image fusion[J]. IEEE Transactions on Instrumentation and Measurement, 2021, 70: 1-14.
[42] WEN Z, WU Q, LIU Z, et al. Polar-spatial feature fusion learning with variational generative-discriminative network for PolSAR classification[J]. IEEE Transactions on Geoscience and Remote Sensing, 2019, 57(11): 8914-8927. DOI: 10.1109/TGRS.2019.2923738
[43] DING X, WANG Y, FU X. Multi-polarization fusion generative adversarial networks for clear underwater imaging[J]. Optics and Lasers in Engineering, 2022, 152: 106971. DOI: 10.1016/j.optlaseng.2022.106971
[44] LIU J, DUAN J, HAO Y, et al. Semantic-guided polarization image fusion method based on a dual-discriminator GAN[J]. Optics Express, 2022, 30: 43601-43621. DOI: 10.1364/OE.472214
[45] SUN R, SUN X, CHEN F, et al. An artificial target detection method combining a polarimetric feature extractor with deep convolutional neural networks[J]. International Journal of Remote Sensing, 2020, 41: 4995-5009. DOI: 10.1080/01431161.2020.1727584
[46] ZHANG Y, Morel O, Blanchon M, et al. Exploration of deep learning based multimodal fusion for semantic road scene segmentation[C]//14th International Conference on Computer Vision Theory and Applications, 2019: 336-343.
[47] XIANG K, YANG K, WANG K. Polarization-driven semantic segmentation via efficient attention-bridged fusion[J]. Optics Express, 2021, 29: 4802-4820. DOI: 10.1364/OE.416130
[48] HUO Yongsheng. Polarization-based research on a priori defogging of dark channel[J]. Acta Physica Sinica, 2022, 71(14): 112-120.
[49] MENG Yufei, WANG Xiaoling, LIU Chang, et al. Dehazing of dual angle polarization image based on mean comparison of quartering dark channels[J]. Laser & Optoelectronics Progress, 2022, 59(4): 232-240.
[50] ZHANG Su, ZHAN Juntong, FU Qiang, et al. Polarization detection defogging technology based on multi-wavelet fusion[J]. Laser & Optoelectronics Progress, 2018, 55(12): 468-477.
[51] HUANG F, KE C, WU X, et al. Polarization dehazing method based on spatial frequency division and fusion for a far-field and dense hazy image[J]. Applied Optics, 2021, 60: 9319-9332. DOI: 10.1364/AO.434886
[52] ZHOU Wenzhou, FAN Chen, HU Xiaoping, et al. Multi-scale singular value decomposition polarization image fusion defogging algorithm and experiment[J]. Chinese Optics, 2021, 14(2): 298-306.
[53] LI Xuan, LIU Fei, SHAO Xiaopeng. Research progress on polarization 3D imaging technology[J]. Journal of Infrared and Millimeter Waves, 2021, 40(2): 248-262.
[54] WANG Xia, ZHAO Yuwei, JIN Weiqi. Overview of polarization-based three-dimensional imaging techniques (Invited)[J]. Opto-electronic Technology Application, 2022, 37(5): 33-43.
[55] YANG Jinfa, YAN Lei, ZHAO Hongying, et al. Shape from polarization of low-texture objects with rough depth information[J]. Journal of Infrared and Millimeter Waves, 2019, 38(6): 819-827.
[56] ZHANG Ruihua, SHI Baixin, YANG Jinfa, et al. Polarization multi-view 3D reconstruction based on parallax angle and zenith angle optimization[J]. Journal of Infrared and Millimeter Waves, 2021, 40(1): 133-142.
[57] BA Y, Gilbert A, WANG F, et al. Deep shape from polarization[C]//Computer Vision–ECCV 2020: 16th European Conference, 2020: 554-571.
[58] CHEN Chuangbin. Surface Normal Estimation Based on Polarization Information[D]. Guangzhou: Guangdong University of Technology, 2021.
[59] WANG Xiaomin. Research on Low Texture Target 3D Reconstruction Algorithm Integrating Polarization and Light Field Information[D]. Taiyuan: North University of China, 2022.
[60] ZENG X, LUO Y, ZHAO X, et al. An end-to-end fully-convolutional neural network for division of focal plane sensors to reconstruct S0, DoLP, and AoP[J]. Optics Express, 2019, 27: 8566-8577. DOI: 10.1364/OE.27.008566
[61] Morimatsu M, Monno Y, Tanaka M, et al. Monochrome and color polarization demosaicking using edge-aware residual interpolation [C]//2020 IEEE International Conference on Image Processing(ICIP), 2020: 2571-2575.
[62] LI N, ZHAO Y, PAN Q, et al. Full-time monocular road detection using zero-distribution prior of angle of polarization[C]//European Conference on Computer Vision (ECCV), 2020: 457-473.
[63] LI N, ZHAO Y, PAN Q, et al. Illumination-invariant road detection and tracking using LWIR polarization characteristics[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2021, 180: 357-369. DOI: 10.1016/j.isprsjprs.2021.08.022
[64] SUN Y, ZHANG J, LIANG R. Color polarization demosaicking by a convolutional neural network[J]. Optics Letters, 2021, 46: 4338-4341. DOI: 10.1364/OL.431919
[65] QIU S, FU Q, WANG C, et al. Linear polarization demosaicking for monochrome and colour polarization focal plane arrays[J]. Computer Graphics Forum, 2021, 40: 77-89. DOI: 10.1111/cgf.14204