Abstract
Background: A number of disciplines, including security, healthcare, and human-machine interaction, have proposed and applied techniques for emotion recognition based on facial expressions.
Objective: To improve computer-based prediction, researchers continue to advance methods for decoding and extracting facial emotions.
Methods: One of the major issues in this field is the contamination of images with noise, which alters image features and ultimately degrades system accuracy; such noise should therefore be removed or reduced. In this study, the wavelet transform is used to denoise the images before classification. Classification accuracies for the original images are also obtained, so that the effect of denoising on the classification accuracy of the facial expression images can be analyzed.
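The denoising step can be sketched as follows. This is a minimal one-level 2-D Haar wavelet soft-thresholding example in NumPy; the choice of the Haar basis, a single decomposition level, and the threshold value are illustrative assumptions, not the study's exact configuration.

```python
import numpy as np

def haar_denoise(img, threshold):
    """One-level 2-D Haar wavelet denoising with soft thresholding (illustrative sketch)."""
    img = img.astype(float)
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # crop to even dimensions
    img = img[:h, :w]
    # Forward Haar transform over 2x2 blocks: approximation (a) and
    # horizontal (hd), vertical (vd), diagonal (dd) detail subbands.
    a  = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 2
    hd = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 2
    vd = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 2
    dd = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 2
    # Soft-threshold only the detail coefficients, where noise concentrates.
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - threshold, 0.0)
    hd, vd, dd = soft(hd), soft(vd), soft(dd)
    # Inverse Haar transform.
    out = np.empty((h, w))
    out[0::2, 0::2] = (a + hd + vd + dd) / 2
    out[0::2, 1::2] = (a - hd + vd - dd) / 2
    out[1::2, 0::2] = (a + hd - vd - dd) / 2
    out[1::2, 1::2] = (a - hd - vd + dd) / 2
    return out
```

With a threshold of zero the transform is perfectly invertible, which is a convenient sanity check; in practice the threshold would be tuned to the estimated noise level, and a multi-level transform (e.g. via PyWavelets) would typically be used.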
Results and Conclusion: Three machine learning approaches, support vector machine, k-nearest neighbor, and naive Bayes, are used to classify the emotions. The feature employed is the histogram of oriented gradients (HOG) of the images. The classification results are obtained, and the effect of denoising on the classification accuracy of the facial expression images is analyzed. Finally, the best result obtained with the wavelet transform method is compared with other wavelet-transform-based facial emotion recognition techniques and is found to be promising.
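The feature-extraction and classification pipeline can be sketched with scikit-learn. The toy images, the reduced global orientation histogram (a full HOG descriptor additionally uses cells, blocks, and block normalization), and all parameter choices below are illustrative assumptions, not the study's setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

def hog_features(img, n_bins=9):
    """Minimal HOG-like descriptor: a global histogram of gradient
    orientations, weighted by gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # unsigned orientations
    hist, _ = np.histogram(ang, bins=n_bins, range=(0.0, 180.0), weights=mag)
    return hist / (hist.sum() + 1e-9)  # L1-normalize

# Hypothetical toy data: two "expression classes" distinguished by
# a dominant vertical vs. horizontal edge.
rng = np.random.default_rng(0)
def make_sample(cls):
    img = rng.normal(0.0, 0.1, (32, 32))
    if cls == 0:
        img[:, 16] += 5.0  # vertical edge -> gradients near 0 degrees
    else:
        img[16, :] += 5.0  # horizontal edge -> gradients near 90 degrees
    return hog_features(img)

X = np.array([make_sample(c) for c in [0, 1] * 40])
y = np.array([0, 1] * 40)

# Fit the three classifiers named in the study and report training accuracy.
for clf in (SVC(), KNeighborsClassifier(), GaussianNB()):
    clf.fit(X, y)
    print(type(clf).__name__, clf.score(X, y))
```

In an actual experiment the features would be extracted from denoised and original face images, and accuracy would be evaluated on a held-out test split rather than the training set.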