Abstract
Background: Early detection of mild cognitive impairment is crucial for preventing Alzheimer's disease. The aim of the present study was to determine whether acoustic features can help differentiate older, independent, community-dwelling individuals with cognitive impairment from healthy controls.
Methods: A total of 8779 participants (mean age 74.2 ± 5.7 years, range 65-96; 3907 males and 4872 females) with different cognitive profiles were evaluated: healthy controls; mild cognitive impairment; global cognitive impairment (defined as a Mini-Mental State Examination score of 20-23); and mild cognitive impairment with global cognitive impairment (a combined status of the two). Participants completed short-sentence reading tasks, and their acoustic features, including temporal features (such as utterance duration and the number and length of pauses) and spectral features (F0, F1, and F2), were used to build a machine learning model to predict their cognitive impairment.
Results: Classification performance against healthy controls was evaluated by the area under the receiver operating characteristic curve (AUC), which was 0.61 for mild cognitive impairment, 0.67 for global cognitive impairment, and 0.77 for mild cognitive impairment with global cognitive impairment.
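The AUC metric reported above can be understood as the probability that a randomly chosen impaired participant receives a higher model score than a randomly chosen healthy control. A minimal sketch of that computation is shown below; the labels and scores are synthetic placeholders, not study data, and the study's actual model and evaluation pipeline are not specified here.

```python
# Hedged illustration: computing the area under the ROC curve (AUC)
# via the Mann-Whitney U statistic. Inputs are synthetic examples,
# not data from the study described in the abstract.

def roc_auc(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    (label 1, e.g. cognitively impaired) scores higher than a
    randomly chosen negative case (label 0, e.g. healthy control).
    Ties count as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic example: higher scores should indicate impairment.
labels = [0, 0, 0, 1, 1, 1]
scores = [0.2, 0.4, 0.6, 0.5, 0.7, 0.9]
print(roc_auc(labels, scores))  # prints 0.8888888888888888
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is why the 0.77 obtained for the combined-impairment group indicates meaningfully better discrimination than the 0.61 for mild cognitive impairment alone.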
Conclusion: Our machine learning model revealed that individuals' acoustic features can be employed to discriminate between healthy controls and those with mild cognitive impairment combined with global cognitive impairment, a more severe condition than either mild cognitive impairment or global cognitive impairment alone. These findings suggest that language impairment becomes more severe as cognitive impairment progresses.
Keywords: Mild cognitive impairment, global cognitive impairment, acoustic analysis, speech, sentence reading, machine learning.