Abstract
Background: The fusion of infrared and visible images is an active topic in the field of image fusion. The methods used to extract and process features directly affect fusion performance.
Objectives: The low resolution (small spatial size) of high-level features leads to a loss of spatial information. Conversely, low-level features are less discriminative because background and noise are insufficiently filtered out.
Methods: To address the insufficient feature utilization of existing methods, we propose a new fusion approach, SC-Fuse, based on self-calibrated residual networks (SCNet) and feature embedding. The method improves fusion quality from two aspects: feature extraction and feature processing.
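As a concrete illustration of the self-calibrated building block named above, the following is a minimal PyTorch sketch. It is a simplified rendering of the published SCNet design (the original additionally splits the channels into a calibrated branch and a plain convolutional branch); the class name SelfCalibratedConv, the layer sizes, and the pooling ratio are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfCalibratedConv(nn.Module):
    """Simplified self-calibrated convolution (after Liu et al., SCNet).

    One branch convolves the input at full resolution; a second branch
    builds a calibration gate from a downsampled view of the input,
    which enlarges the effective receptive field of each output location.
    Layer widths and the pooling ratio here are illustrative only.
    """

    def __init__(self, channels: int, pooling_r: int = 4):
        super().__init__()
        self.k2 = nn.Conv2d(channels, channels, 3, padding=1)  # acts on the pooled view
        self.k3 = nn.Conv2d(channels, channels, 3, padding=1)  # full-resolution branch
        self.k4 = nn.Conv2d(channels, channels, 3, padding=1)  # output transform
        self.pooling_r = pooling_r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Calibration map from a coarse view of x, upsampled back to x's size.
        pooled = F.avg_pool2d(x, self.pooling_r)
        cal = F.interpolate(self.k2(pooled), size=x.shape[2:],
                            mode="bilinear", align_corners=False)
        cal = torch.sigmoid(x + cal)   # per-location gate in [0, 1]
        out = self.k3(x) * cal         # modulate the full-resolution features
        return self.k4(out)
```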
Results: First, self-calibrated modules are applied to image fusion for the first time; they enlarge the receptive field so that feature maps carry more information. Second, we process features with zero-phase component analysis (ZCA) and the l1-norm, and propose a feature embedding operation that makes feature information at different levels complementary.
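The feature-processing step can be sketched in the same spirit. Below is a minimal NumPy example of ZCA whitening followed by l1-norm activity weighting, under the assumption that per-pixel fusion weights are obtained by normalizing the l1-norms of the whitened feature maps; the function names zca_whiten and fuse_weights and the epsilon constants are ours, not the paper's.

```python
import numpy as np

def zca_whiten(feat: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """ZCA-whiten a (C, H, W) feature map across channels.

    Flattens the spatial positions, removes channel correlations via an
    eigendecomposition of the covariance, and maps the result back.
    This is a generic ZCA step; the paper's exact normalization may differ.
    """
    c, h, w = feat.shape
    x = feat.reshape(c, -1)
    x = x - x.mean(axis=1, keepdims=True)
    cov = x @ x.T / (x.shape[1] - 1)
    vals, vecs = np.linalg.eigh(cov)
    zca = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T
    return (zca @ x).reshape(c, h, w)

def fuse_weights(feat_ir: np.ndarray, feat_vis: np.ndarray):
    """Per-pixel fusion weights from the l1-norm of whitened features."""
    a_ir = np.abs(zca_whiten(feat_ir)).sum(axis=0)   # activity map, shape (H, W)
    a_vis = np.abs(zca_whiten(feat_vis)).sum(axis=0)
    total = a_ir + a_vis + 1e-12
    return a_ir / total, a_vis / total  # weights for the IR and visible sources
```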
Conclusion: Finally, a suitable strategy is given for reconstructing the fused image. Ablation experiments and comparisons with other representative algorithms demonstrate the effectiveness and superiority of SC-Fuse.
Keywords: Image fusion, self-calibrated convolutions, feature extraction, feature embedding, image reconstruction