Multimodal Affective Computing: Affective Information Representation, Modelling, and Analysis

Multimodal Fusion Framework: Emotion Recognition From Physiological Signals

Author(s): Gyanendra K. Verma

Pp: 115-127 (13)

DOI: 10.2174/9789815124453123010012


Abstract

This study presents a multimodal fusion framework for emotion recognition from physiological signals. In contrast to recognition from facial expressions, a larger set of emotions can be recognized accurately from physiological signals. The DEAP database, a benchmark multimodal database containing EEG and peripheral physiological recordings, is employed for experimentation. The proposed method focuses on features that are subject-independent and can accommodate a wider range of emotions. Because many channel features must be handled together, in particular synchronous EEG channels, feature-level fusion is applied in this study. The features extracted from the EEG and peripheral signals are the relative, logarithmic, and absolute power energies of the Alpha, Beta, Gamma, Delta, and Theta bands. Experimental results demonstrate that the Theta and Beta bands of the physiological signals contribute most to recognition performance, and that the SVM classifier performs notably well.
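The pipeline outlined in the abstract (per-channel band-power features, feature-level fusion by concatenation, SVM classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the band edges, the 128 Hz sampling rate, the rectangular integration of the power spectrum, and the synthetic trials are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

# Assumed band edges in Hz (illustrative, not taken from the chapter).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}
FS = 128  # DEAP recordings are commonly downsampled to 128 Hz

def band_power_features(signal, fs=FS):
    """Absolute, relative, and logarithmic band power for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    total = psd.sum()
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        p_abs = psd[mask].sum()                 # absolute power (rectangular sum)
        feats += [p_abs,                        # absolute
                  p_abs / total,                # relative
                  np.log(p_abs + 1e-12)]        # logarithmic
    return np.array(feats)

def fuse_features(channels):
    """Feature-level fusion: concatenate per-channel feature vectors."""
    return np.concatenate([band_power_features(ch) for ch in channels])

# Toy demonstration on synthetic 4-channel, 4-second trials.
rng = np.random.default_rng(0)
X = np.array([fuse_features(rng.standard_normal((4, FS * 4)))
              for _ in range(20)])
y = np.arange(20) % 2  # two synthetic emotion classes
clf = SVC(kernel="rbf").fit(X, y)
```

Each trial yields 5 bands x 3 power measures = 15 features per channel, so four channels fuse into a 60-dimensional vector before classification.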

© 2024 Bentham Science Publishers