Abstract
Background: Accurate and timely detection of intracranial hemorrhage (ICH) is of utmost importance, as delayed diagnosis can lead to severe disability or death. Hence, the presented work leverages a pretrained deep convolutional neural network (CNN) for the detection of ICH in computed tomography (CT) brain images.
Methods: Different frameworks were analyzed for their effectiveness in classifying CT brain images into hemorrhage and non-hemorrhage conditions. All of these frameworks were evaluated on the CQ500 dataset, and a dedicated preprocessing pipeline was designed for both normal and ICH CT images. First, a framework employing the pretrained deep CNN AlexNet was used for both feature extraction and classification via transfer learning. Second, a modified AlexNet-support vector machine (SVM) classifier was explored. Finally, a feature selection method, principal component analysis (PCA), was introduced into the AlexNet-SVM classifier model, and its efficacy was examined. These models were trained and tested on two different sets of CT images: one containing the original images without preprocessing and another consisting of preprocessed images.
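The second and third frameworks above (deep features fed to an SVM, with optional PCA reduction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic random features stand in for the 4096-dimensional activations that a pretrained AlexNet would produce, and the labels, sample counts, and component count are placeholder values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder for deep features: in the paper these would come from a
# pretrained AlexNet (4096-dimensional penultimate-layer activations).
rng = np.random.default_rng(0)
n_images, n_features = 200, 4096
X = rng.normal(size=(n_images, n_features))
y = rng.integers(0, 2, size=n_images)  # 0 = non-hemorrhage, 1 = ICH (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AlexNet-SVM variant: an SVM trained directly on the deep features.
svm_clf = SVC(kernel="linear").fit(X_train, y_train)

# AlexNet-PCA-SVM variant: PCA compresses the 4096-D features before the SVM.
pca_svm = make_pipeline(PCA(n_components=50), SVC(kernel="linear"))
pca_svm.fit(X_train, y_train)

print(pca_svm.named_steps["pca"].n_components_)  # number of retained components
```

In practice, the random feature matrix would be replaced by activations extracted from the CT images with the frozen pretrained network, and the PCA component count would be tuned on a validation split.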
Results: The modified AlexNet-SVM classifier showed improved performance over the other investigated frameworks, achieving a classification accuracy of 99.86%, with sensitivity and specificity of 0.9986 each, for the detection of ICH in brain CT images.
Conclusion: This research presents a simple and efficient framework for the classification of hemorrhage and non-hemorrhage images. The proposed simplified deep learning framework also demonstrates its potential as a screening tool to assist radiology trainees in the accurate detection of ICH.
Keywords: Deep CNN, transfer learning, classification, intracranial hemorrhage, ICH, principal component analysis.
Graphical Abstract