Abstract
This study presents a multimodal fusion framework for emotion recognition
from physiological signals. In contrast to facial-expression-based emotion
recognition, a larger number of emotions can be recognized accurately from
physiological signals. The DEAP database, a benchmark multimodal database
containing EEG and peripheral physiological recordings, is employed for
experimentation. The proposed method considers features that are
subject-independent and can accommodate a wider range of emotions. Because
feature-level fusion can handle many channel features, especially
synchronous EEG channels, it is applied in this study. The features
extracted from the EEG and peripheral signals are the relative,
logarithmic, and absolute power of the Alpha, Beta, Gamma, Delta, and Theta
bands. Experimental results demonstrate that the Theta and Beta bands are
the most significant contributors to performance, and that among the
classifiers evaluated, the SVM performs best.
About this chapter
Cite this chapter as:
Gyanendra K. Verma, "Multimodal Fusion Framework: Emotion Recognition From Physiological Signals," in Multimodal Affective Computing: Affective Information Representation, Modelling, and Analysis (2023), p. 115. https://doi.org/10.2174/9789815124453123010012
DOI: https://doi.org/10.2174/9789815124453123010012
Publisher Name: Bentham Science Publisher