Abstract
Background: The detection of brain tumors is a complicated task that requires specialized skills and interpretation techniques. Accurate classification and segmentation of brain tumors in MR images are essential for selecting appropriate medical treatment. Different objects within an MR image can have similar size, shape, and density, which makes tumor classification and segmentation even more complex.
Objective: To classify brain MR images as tumorous or non-tumorous with high accuracy using deep features and different classifiers.
Methods: In this study, a novel four-step process is proposed: pre-processing for image enhancement and compression, feature extraction using convolutional neural networks (CNNs), classification using a multilayer perceptron, and finally, tumor segmentation using an enhanced fuzzy c-means method.
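As a rough illustration of the feature-extraction, classification, and segmentation steps, the sketch below pairs a pretrained VGG16 backbone with scikit-learn's MLPClassifier and segments a slice with standard fuzzy c-means. The backbone, hyper-parameters, and stand-in data are all illustrative assumptions, and the standard c-means update is used in place of the paper's enhanced variant, which is not reproduced here.

    import numpy as np
    from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
    from sklearn.neural_network import MLPClassifier

    # Feature extraction: VGG16 without its classification head, global-
    # average-pooled to one 512-dimensional feature vector per image
    # (an assumed backbone, not the paper's specified CNN).
    cnn = VGG16(weights="imagenet", include_top=False, pooling="avg")

    def extract_features(images):
        """images: float array (n, 224, 224, 3) with values in [0, 255]."""
        return cnn.predict(preprocess_input(images.copy()), verbose=0)

    # Classification: a multilayer perceptron trained on the deep features.
    # X_train would hold pre-processed MR slices and y_train their labels
    # (1 = tumorous, 0 = non-tumorous); random data stands in here.
    X_train = np.random.rand(8, 224, 224, 3) * 255.0
    y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1])
    mlp = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500, random_state=0)
    mlp.fit(extract_features(X_train), y_train)

    # Segmentation: standard fuzzy c-means on the pixel intensities of a
    # slice flagged as tumorous.
    def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, eps=1e-9, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.random((X.shape[0], n_clusters))
        u /= u.sum(axis=1, keepdims=True)                 # memberships sum to 1
        for _ in range(n_iter):
            um = u ** m
            centers = um.T @ X / um.sum(axis=0)[:, None]  # fuzzy cluster means
            d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + eps
            u = 1.0 / d ** (2.0 / (m - 1.0))              # inverse-distance weights
            u /= u.sum(axis=1, keepdims=True)
        return centers, u

    slice_2d = X_train[1, :, :, 0]                        # one channel of one slice
    _, memberships = fuzzy_c_means(slice_2d.reshape(-1, 1), n_clusters=3)
    segmentation = memberships.argmax(axis=1).reshape(slice_2d.shape)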
Results: The system was tested on 65 cases across four modalities, comprising 40,300 MR images obtained from the BRATS-2015 dataset. These include images of 26 Low-Grade Glioma (LGG) and 39 High-Grade Glioma (HGG) tumor cases. The proposed CNN feature-based classification technique outperforms existing methods, achieving an average accuracy of 98.77%, and a noticeable improvement in the segmentation results is also observed.
Conclusion: The proposed method for classifying brain MR images to detect glioma tumors can be adopted, as it yields better results with high accuracy.
Keywords: Tumor detection, MR image classification, CNN features, glioma tumor, tumor segmentation, brain MRI.