Abstract
Background: Facial emotion recognition (FER) technology is regarded as a productive interface in many applications and, over the past decade, has received particular attention as an alternative communication channel between a user and a device in human-computer interaction. The performance of a facial recognition model depends directly on the capability of its classification methods. In addition, an appropriate trade-off between recognition accuracy and computational cost is considered the most important factor in designing such models.
Methods: The objective of this work was to classify facial emotion electromyogram (fEMG) signals using a neural network (NN) algorithm, a support vector machine (SVM) algorithm, and a Naive Bayes algorithm, and to compare the classification accuracies obtained with distinct feature extraction procedures applied to the fEMG signals. First, eight participants (six male and two female) were recruited for data recording. Four electrodes were placed on each participant's face to capture facial gestures (happy, angry, sad, and fear), and two electrodes were placed on the wrist as a ground reference. Data were recorded with a BIOPAC MP150 system. The signals were then band-pass filtered and segmented for improved processing, after which time-domain and frequency-domain feature extraction was carried out. In this work, LabVIEW and MATLAB were used to produce a feature set from the fEMG signals for the four emotional conditions (anger, sadness, fear, and happiness). After feature extraction, the extracted features were assigned to their respective emotions, and were trained and classified using the SVM, neural network, and Naive Bayes classifiers in MATLAB 2020. A minimal illustrative sketch of this pipeline is given below.
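The following is a minimal sketch of the processing chain described above (band-pass filtering, time- and frequency-domain feature extraction, and classification). It is written in Python with SciPy and scikit-learn rather than the authors' LabVIEW/MATLAB code, and the sampling rate, pass band, feature set, and SVM settings are illustrative assumptions, not values reported in the paper.

# Illustrative sketch of the fEMG pipeline; parameter values are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 2000.0  # assumed sampling rate (Hz)

def bandpass(x, low=20.0, high=450.0, fs=FS, order=4):
    # Zero-phase Butterworth band-pass filter (pre-processing step).
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def extract_features(segment, fs=FS):
    # Time-domain features: mean absolute value, RMS, waveform length.
    mav = np.mean(np.abs(segment))
    rms = np.sqrt(np.mean(segment ** 2))
    wl = np.sum(np.abs(np.diff(segment)))
    # Frequency-domain feature: mean frequency from the Welch spectrum.
    f, pxx = welch(segment, fs=fs, nperseg=min(256, len(segment)))
    mnf = np.sum(f * pxx) / np.sum(pxx)
    return np.array([mav, rms, wl, mnf])

def evaluate(segments, labels):
    # segments: list of raw fEMG windows; labels: emotion per window
    # (e.g., happy, angry, sad, fear). Returns mean cross-validated accuracy.
    X = np.vstack([extract_features(bandpass(s)) for s in segments])
    clf = SVC(kernel="rbf", C=1.0)  # the NN or Naive Bayes model could be swapped in here
    return cross_val_score(clf, X, labels, cv=5).mean()

The same feature matrix X can be passed to a neural network or Naive Bayes classifier to reproduce the comparison reported in the Results.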
Results: The SVM and neural network classifiers achieved accuracies of 93.80% and 96.90%, respectively, whereas the Naive Bayes classifier achieved an accuracy of 90.60%.
Conclusion: Facial emotion recognition (FER) is viewed as a progressive, forward-looking technology that has attracted the attention of researchers in several fields because of its strong prospects in a wide range of applications. Recognizing emotions from the biomedical signals produced by facial muscle movements has recently been presented as a direct and reliable approach.