Abstract
Background: Image registration is the process of aligning two or more images into a single coordinate system. Medical image registration now plays a significant role in computer-assisted disease diagnosis, treatment, and surgery. The variety of modalities available in medical imaging makes registration an essential step in Computer-Assisted Diagnosis (CAD), Computer-Aided Therapy (CAT), and Computer-Assisted Surgery (CAS).
Problem Definition: Recently, many learning-based methods have been employed for disease detection and classification, but they are unsuitable for real-time use because of delayed responses and the need for pre-alignment and labeling.
Methods: The proposed research constructs a deep learning model that combines a rigid transform and a B-spline transform for medical image registration, enabling automatic brain tumor detection. The method consists of two steps: the first applies a rigid-transformation-based Convolutional Neural Network (CNN), and the second applies a B-spline-transform-based CNN. The model is trained and tested on 3624 MR (Magnetic Resonance) images to assess its performance. The researchers believe that MR images contribute to the successful treatment of patients with brain tumors.
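The two-step pipeline (rigid pre-alignment followed by deformable refinement) can be sketched in plain NumPy/SciPy. This is an illustrative stand-in, not the authors' model: in the paper each stage's parameters are predicted by a CNN, whereas here the rigid parameters and the control-point displacements are simply given, and cubic interpolation of a coarse displacement grid stands in for the B-spline basis.

```python
import numpy as np
from scipy import ndimage

def rigid_warp(image, angle_deg, shift):
    """Stage 1: rigid alignment (rotation about the image centre + translation)."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    centre = (np.array(image.shape) - 1) / 2.0
    offset = centre - rot @ centre + np.asarray(shift)
    return ndimage.affine_transform(image, rot, offset=offset, order=1)

def deformable_warp(image, coarse_disp):
    """Stage 2: deformable warp from a coarse control-point displacement grid,
    upsampled smoothly (cubic interpolation approximates a B-spline FFD)."""
    zoom = [s / c for s, c in zip(image.shape, coarse_disp.shape[1:])]
    dy = ndimage.zoom(coarse_disp[0], zoom, order=3)
    dx = ndimage.zoom(coarse_disp[1], zoom, order=3)
    yy, xx = np.meshgrid(*[np.arange(s) for s in image.shape], indexing="ij")
    return ndimage.map_coordinates(image, [yy + dy, xx + dx], order=1)

# Toy moving image: a bright square on a dark background.
moving = np.zeros((64, 64))
moving[20:40, 20:40] = 1.0
# Hypothetical parameters (in the paper these come from the CNNs).
coarse = 0.5 * np.random.default_rng(0).standard_normal((2, 4, 4))
aligned = rigid_warp(moving, angle_deg=5.0, shift=(2.0, -1.0))
registered = deformable_warp(aligned, coarse)
```

The design point the sketch illustrates is the composition: a cheap global rigid stage removes pose differences so the deformable stage only has to model local anatomical variation.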
Results: The results of the proposed method are compared with the Rigid Convolutional Neural Network (CNN), Rigid CNN + Thin-Plate Spline (TPS), Affine CNN, VoxelMorph, ADMIR (Affine and Deformable Medical Image Registration), and ANTs (Advanced Normalization Tools) using the Dice score, Average Symmetric Surface Distance (ASD), and Hausdorff distance.
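The three evaluation metrics named above have standard definitions independent of this paper; a minimal sketch for binary segmentation masks (NumPy/SciPy, not the authors' code) might look like:

```python
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice score: 2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def surface_points(mask):
    """Border voxels of a binary mask (mask minus its erosion)."""
    mask = mask.astype(bool)
    return np.argwhere(mask & ~binary_erosion(mask)).astype(float)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the two mask surfaces."""
    pa, pb = surface_points(a), surface_points(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

def asd(a, b):
    """Average Symmetric Surface Distance: mean nearest-surface distance,
    averaged over both directions."""
    pa, pb = surface_points(a), surface_points(b)
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())
```

For example, two 2×2 squares overlapping in two pixels give a Dice score of 0.5; the Hausdorff distance is driven by the worst-matched surface point, which is why it is reported alongside the averaged ASD.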
Conclusion: The proposed Rigid and B-Spline CNN (RBCNN) model will help physicians automatically detect and classify brain tumors quickly (18 s) and efficiently, without pre-alignment or labeling.
Keywords: Medical image registration, deep learning, rigid transformation, B-spline transform, convolutional neural network, brain tumor magnetic resonance images, advanced normalization tools.
Graphical Abstract