Abstract:
This study proposes a multisource fusion network (MF-Net) that combines visible and infrared images for photovoltaic panel inspection, achieving defect detection, classification, and localization. Traditional inspection methods suffer from low efficiency, low accuracy, and high cost. The proposed detection network is built on the YOLOv3-tiny backbone: deep layers are added in a dense-block structure to enrich the semantic information of the fused feature maps, and the detection scales of the network are extended to improve applicability to objects of different sizes. In addition, an adaptive weight fusion strategy is proposed for feature-map fusion, in which the fusion coefficients are allocated according to pixel neighborhood information. Experiments show that the proposed network improves mAP by 7.41% over the backbone, and replacing a conventional weighted fusion strategy with the proposed one yields a further gain of approximately 2.14% mAP, effectively enhancing the saliency of the features. Compared with other networks, the proposed network with fused images as input achieves the highest F1 score (F1 = 0.86).