Abstract
This study presents a multimodal fusion framework for emotion recognition
from physiological signals. Unlike facial-expression-based approaches,
physiological signals allow a wider range of emotions to be recognized
accurately. The DEAP database, a benchmark multimodal database containing
EEG and peripheral physiological recordings, is employed for experimentation. The
proposed method focuses on features that are subject-independent and can
accommodate a larger set of emotions. Because feature-level fusion can handle
features from many channels, especially synchronized EEG channels, it is adopted in this study. The
features extracted from the EEG and peripheral signals are the relative, logarithmic, and
absolute band powers of the Alpha, Beta, Gamma, Delta, and Theta bands. Experimental results
demonstrate that the Theta and Beta bands of the physiological signals contribute
most to recognition performance, and that the SVM classifier yields outstanding results.
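As a rough illustration of the band-power features and feature-level fusion described in the abstract, the sketch below computes absolute, relative, and logarithmic band power per channel and concatenates the per-channel features. The 128 Hz sampling rate (DEAP's preprocessed EEG rate) and the band edges are conventional assumptions, not specifics taken from this paper:

```python
import numpy as np

# Assumed sampling rate (DEAP EEG is downsampled to 128 Hz) and
# conventional band edges in Hz; the paper may use different values.
FS = 128
BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 45),
}

def band_power_features(signal, fs=FS, bands=BANDS):
    """Absolute, relative, and logarithmic band power for one channel."""
    # Simple periodogram: squared FFT magnitude as a PSD estimate.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)

    # Total power over the full analysis range, for relative power.
    total = psd[(freqs >= 1) & (freqs <= 45)].sum()
    feats = {}
    for name, (lo, hi) in bands.items():
        absolute = psd[(freqs >= lo) & (freqs < hi)].sum()
        feats[f"{name}_abs"] = absolute
        feats[f"{name}_rel"] = absolute / total
        feats[f"{name}_log"] = np.log(absolute + 1e-12)
    return feats

def fuse(channels, fs=FS):
    """Feature-level fusion: concatenate feature vectors of all channels."""
    return np.concatenate(
        [list(band_power_features(ch, fs).values()) for ch in channels]
    )
```

The fused vector (five bands, three power variants per band, per channel) would then feed a classifier such as an SVM.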