
Recent Advances in Computer Science and Communications

ISSN (Print): 2666-2558
ISSN (Online): 2666-2566

General Research Article

Analysis of Univariate and Multivariate Filters Towards the Early Detection of Dementia

Author(s): Deepika Bansal, Kavita Khanna*, Rita Chhikara, Rakesh Kumar Dua and Rajeev Malhotra

Volume 15, Issue 4, 2022

Published on: 30 September, 2020

Article ID: e220322186427
Pages: 9

DOI: 10.2174/2666255813999200930163857

Abstract

Objective: Dementia is a progressive neurodegenerative brain disease that results in the death of nerve cells and is emerging as a global health problem in adults aged 65 years or above. Eliminating redundant and irrelevant features from the datasets is necessary for accurate detection and, consequently, timely treatment of dementia.

Methods: For this purpose, an ensemble of univariate and multivariate feature selection methods has been proposed in this study. A comparison of four univariate feature selection techniques (t-Test, Wilcoxon, Entropy, and ROC) and six multivariate feature selection approaches (ReliefF, Bhattacharyya, CFSSubsetEval, ClassifierAttributeEval, CorrelationAttributeEval, and OneRAttributeEval) has been performed. An ensemble of the best univariate and multivariate filter algorithms is proposed, yielding a subset of features that contains only relevant and non-redundant features. Classification is performed using the Naïve Bayes, k-NN, and Random Forest algorithms.
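
As a rough illustration of the workflow described above (not the authors' implementation), the sketch below combines a univariate t-Test ranking with a simplified Relief-style weighting standing in for ReliefF, averages the two rankings, keeps the top 10 features, and evaluates Naïve Bayes, k-NN, and Random Forest. The synthetic dataset, feature counts, and hyperparameters are placeholders, not the study's data or settings.

```python
# Minimal sketch of an ensemble filter pipeline, assuming a synthetic dataset
# in place of the study's MRI/clinical data.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import minmax_scale

X, y = make_classification(n_samples=200, n_features=50, n_informative=10,
                           random_state=0)   # placeholder data, two classes
X = minmax_scale(X)                          # Relief-style weights assume [0, 1] features

# Univariate filter: absolute t-statistic per feature (larger = more relevant).
t_scores = np.abs(ttest_ind(X[y == 0], X[y == 1], axis=0).statistic)

# Simplified Relief weights: reward separation from the nearest miss and
# penalise distance to the nearest hit (the paper's ReliefF averages k neighbours).
def relief_weights(X, y):
    w = np.zeros(X.shape[1])
    for i, x in enumerate(X):
        dist = np.abs(X - x).sum(axis=1)   # Manhattan distance to every sample
        dist[i] = np.inf                   # ignore the sample itself
        hit = np.argmin(np.where(y == y[i], dist, np.inf))
        miss = np.argmin(np.where(y != y[i], dist, np.inf))
        w += np.abs(x - X[miss]) - np.abs(x - X[hit])
    return w / len(X)

relief_scores = relief_weights(X, y)

# Ensemble ranking: average the two rank vectors and keep the 10 best features.
ranks = np.argsort(np.argsort(-t_scores)) + np.argsort(np.argsort(-relief_scores))
top10 = np.argsort(ranks)[:10]

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("k-NN", KNeighborsClassifier(n_neighbors=3)),
                  ("Random Forest", RandomForestClassifier(random_state=0))]:
    acc = cross_val_score(clf, X[:, top10], y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```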

Results: Experimental results show that t-Test and ReliefF feature selection can select 10 relevant features that give the same accuracy as when all features are considered. In addition, the accuracy obtained using k-NN with the ensemble approach is 99.96%. The statistical significance of the method has been established using Friedman’s statistical test.
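
For context, Friedman's test compares the per-fold scores of several methods measured on the same data. The snippet below is illustrative only: the accuracy values are placeholders, not the paper's reported results.

```python
# Hypothetical per-fold accuracies for the three classifiers on the same folds.
from scipy.stats import friedmanchisquare

nb  = [0.91, 0.93, 0.90, 0.92, 0.94]   # Naive Bayes
knn = [0.97, 0.99, 0.98, 0.99, 1.00]   # k-NN
rf  = [0.95, 0.96, 0.94, 0.97, 0.96]   # Random Forest

stat, p = friedmanchisquare(nb, knn, rf)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")  # p < 0.05 => significant difference
```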

Conclusion: The new ranking criterion computed by the ensemble method efficiently eliminates insignificant features and reduces the computational cost of the algorithm. The ensemble method has been compared with other approaches to establish the superiority of the proposed model.

Discussion: The percentage gain in accuracy after applying feature selection is notable for all three classifiers, with Naïve Bayes and k-NN showing the most remarkable improvement. Among the univariate filter methods, the t-Test outperforms the others while selecting a subset of only 10 features.
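
For clarity, the percentage gain referred to above can be computed as shown below; the baseline and post-selection accuracies are placeholders, not the paper's values.

```python
# Hypothetical illustration of a percentage gain in accuracy.
baseline_acc = 0.85          # accuracy with all features (placeholder)
selected_acc = 0.93          # accuracy with the 10 selected features (placeholder)
gain = (selected_acc - baseline_acc) / baseline_acc * 100
print(f"Percentage gain in accuracy: {gain:.2f}%")   # -> 9.41%
```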

Keywords: Dementia, machine learning, feature selection, univariate filters, multivariate filters, classification accuracy.

