Abstract
Background: Faced with the global threat posed by SARS-CoV-2 (COVID-19), low-dose computed tomography (LDCT), the primary diagnostic tool, is often accompanied by high levels of noise that can interfere with the radiologist's assessment. Convolutional neural networks (CNNs), a class of deep learning methods, have been shown to be highly effective at image denoising.
Objective: The objective of this study was to train a denoising model with a modified convolutional neural network algorithm, so that the model better extracts the salient features of lesion regions, effectively removes noise from COVID-19 lung CT images, preserves more of the images' important detail, and reduces the adverse effects of denoising.
Methods: We propose a CNN-based deformable convolutional denoising network (DCDNet). By combining deformable convolution with residual learning on top of a standard CNN structure, more image detail is retained during CT image denoising.
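As a rough illustration of the deformable-convolution idea described above (a minimal single-channel sketch, not the authors' DCDNet implementation; all function names here are hypothetical), each 3x3 kernel tap is sampled at a learnable real-valued offset from its regular grid position via bilinear interpolation:

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Bilinearly sample img (H, W) at real-valued coords (y, x), zero-padded outside."""
    H, W = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    val = 0.0
    for dy in (0, 1):
        for dx in (0, 1):
            yy, xx = y0 + dy, x0 + dx
            if 0 <= yy < H and 0 <= xx < W:
                # Weight by distance to the surrounding integer grid points.
                val += (1 - abs(y - yy)) * (1 - abs(x - xx)) * img[yy, xx]
    return val

def deform_conv2d(img, kernel, offsets):
    """3x3 deformable convolution on a single-channel image.

    offsets: array (H, W, 9, 2) of per-pixel (dy, dx) shifts for each of the
    9 kernel taps; with all-zero offsets this reduces to an ordinary 3x3 conv.
    """
    H, W = img.shape
    out = np.zeros((H, W))
    taps = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1)]
    for y in range(H):
        for x in range(W):
            for k, (i, j) in enumerate(taps):
                dy, dx = offsets[y, x, k]
                out[y, x] += kernel[i + 1, j + 1] * bilinear_sample(
                    img, y + i + dy, x + j + dx)
    return out
```

In a residual-learning denoiser of this kind, the network typically predicts the noise map rather than the clean image, and the denoised estimate is the noisy input minus the predicted residual.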
Results: According to the noise reduction evaluation indices PSNR, SSIM, and RMSE, DCDNet shows excellent denoising performance on COVID-19 CT images. Visually, DCDNet effectively removes image noise while preserving more detailed features of lung lesions.
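Of the three evaluation indices, PSNR and RMSE follow directly from their standard definitions (SSIM requires a windowed structural comparison and is omitted here). A minimal NumPy sketch, assuming an 8-bit data range of 255:

```python
import numpy as np

def rmse(ref, test):
    """Root mean squared error between a reference and a test image."""
    diff = ref.astype(np.float64) - test.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(ref, test, data_range=255.0):
    """Peak signal-to-noise ratio in dB: 20 * log10(data_range / RMSE)."""
    e = rmse(ref, test)
    return float("inf") if e == 0.0 else 20.0 * np.log10(data_range / e)
```

Lower RMSE and higher PSNR both indicate a denoised image closer to the reference; for CT data stored in Hounsfield units, `data_range` would be set to the actual intensity span instead of 255.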
Conclusion: The experimental results indicate that, under the same training set, the model trained with DCDNet is better suited to denoising COVID-19 CT images than traditional image denoising algorithms.
Keywords: COVID-19 CT image denoising, convolutional neural network, deformable convolution, residual learning, noise reduction evaluation index, improved CT image quality