
Current Medical Imaging


ISSN (Print): 1573-4056
ISSN (Online): 1875-6603

Research Article

Multi-modal Medical Image Fusion Algorithm Based on Spatial Frequency Motivated PA-PCNN in the NSST Domain

Author(s): K. Vanitha*, D. Satyanarayana and M.N.G. Prasad

Volume 17, Issue 5, 2021

Published on: 18 November, 2020

Pages: 634-643 (10 pages)

DOI: 10.2174/1573405616666201118123220

Abstract

Background: Image fusion has grown into an effective tool in disease-diagnosis schemes.

Methods: In this paper, a new method for fusing multi-modal medical images using a spatial-frequency-motivated parameter-adaptive pulse coupled neural network (SF-PAPCNN) is proposed. The multi-modal source images are decomposed into frequency sub-bands with the non-subsampled shearlet transform (NSST). The low-frequency sub-band coefficients are selected using the maximum rule, while the high-frequency sub-band coefficients are combined by the SF-PAPCNN.

Results: The fused medical image is obtained by applying the inverse NSST (INSST) to the fused coefficients.
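The pipeline is therefore a decompose-fuse-reconstruct loop. The minimal Python sketch below illustrates it, assuming hypothetical nsst_decompose and nsst_reconstruct helpers (no standard Python NSST library is implied) and substituting a direct local spatial-frequency comparison for the full PA-PCNN firing iteration, which is omitted for brevity.

```python
# Minimal sketch of the NSST-based fusion pipeline described above.
# nsst_decompose / nsst_reconstruct are hypothetical placeholders.
import numpy as np
from scipy.ndimage import uniform_filter


def local_spatial_frequency(band: np.ndarray, size: int = 3) -> np.ndarray:
    """Per-coefficient spatial frequency SF = sqrt(RF^2 + CF^2) over a
    size x size neighbourhood (RF/CF: mean squared row/column gradients)."""
    dr = np.zeros_like(band)
    dc = np.zeros_like(band)
    dr[:, 1:] = np.diff(band, axis=1)   # row-wise differences
    dc[1:, :] = np.diff(band, axis=0)   # column-wise differences
    return np.sqrt(uniform_filter(dr ** 2, size) + uniform_filter(dc ** 2, size))


def fuse_nsst(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    # 1. NSST decomposition: one low-frequency band plus a list of
    #    high-frequency directional bands per image (hypothetical helpers).
    low_a, highs_a = nsst_decompose(img_a)
    low_b, highs_b = nsst_decompose(img_b)

    # 2. Low-frequency coefficients: absolute-maximum selection rule.
    fused_low = np.where(np.abs(low_a) >= np.abs(low_b), low_a, low_b)

    # 3. High-frequency coefficients: keep, per coefficient, the source with
    #    the larger local spatial frequency (a simplified stand-in for the
    #    SF-motivated PA-PCNN firing-map decision).
    fused_highs = [
        np.where(local_spatial_frequency(ha) >= local_spatial_frequency(hb), ha, hb)
        for ha, hb in zip(highs_a, highs_b)
    ]

    # 4. Inverse NSST rebuilds the fused image from the fused coefficients.
    return nsst_reconstruct(fused_low, fused_highs)
```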

Conclusion: Quality metrics such as entropy (ENT), fusion symmetry (FS), standard deviation (STD), mutual information (QMI) and edge strength (QAB/F) are used to validate the efficacy of the proposed scheme.
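For orientation, the short sketch below computes two of the simpler cited metrics, entropy (ENT) and standard deviation (STD), on an 8-bit fused image; QMI and QAB/F require fuller mutual-information and gradient-based edge-transfer implementations and are not shown.

```python
# Sketch of two of the cited quality metrics for an 8-bit fused image.
import numpy as np


def entropy(img: np.ndarray, levels: int = 256) -> float:
    """Shannon entropy (bits) of the grey-level histogram."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return float(-np.sum(p * np.log2(p)))


def std_dev(img: np.ndarray) -> float:
    """Standard deviation of pixel intensities (a contrast indicator)."""
    return float(np.std(img.astype(np.float64)))
```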

Keywords: Image fusion, spatial frequency, non-subsampled shearlet transform (NSST), parameter-adaptive pulse coupled neural network (PAPCNN), medical imaging, diseases.


