Institutional Digital Repository
Shreenivas Deshpande Library, IIT (BHU), Varanasi

Deep Learning-Based Automated Emotion Recognition Using Multimodal Physiological Signals and Time-Frequency Methods


Abstract

Accurate prediction and recognition of human emotions are crucial for effective human-computer interfaces. An automated emotion recognition (AER) method is therefore highly desirable, and multimodal approaches have gained scientific attention due to their ability to leverage different modalities for improved accuracy and reliability. Our study classifies multiple emotional states using multimodal physiological signals, namely electrodermal activity (EDA) and electrocardiogram (ECG), together with various deep-learning (DL) networks. The signals were obtained from two publicly available datasets: continuously annotated signals of emotion (CASE) and wearable stress and affect detection (WESAD). The EDA signals were decomposed into tonic and phasic components using the convex-optimization-based EDA method. The ECG signals and the phasic components of EDA were then subjected to different time-frequency representations (TFRs), namely the short-time Fourier transform, the continuous wavelet transform (CWT), and the mel-frequency cepstrum (MFC). These TFRs were fed as inputs to AlexNet, a configurable convolutional neural network (cCNN), and a pretrained VGG16 architecture in unimodal and multimodal settings to extract robust features for effective classification. Our results demonstrate that multimodal settings outperform unimodal ones, with ECG proving more effective than EDA in emotion classification. The best pipeline (EDA + ECG, CWT, and VGG16) yielded the highest accuracies of 86.66% and 83.96% for four-class and three-class emotion classification, respectively. In conclusion, our results suggest that network performance in accurate AER improves with multimodal physiological signals compared to unimodal signals. Consequently, the proposed method holds promise as a valuable tool for assessing emotional states in automated decision-making. © 1963-2012 IEEE.
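The abstract's pipeline converts each 1-D physiological signal into a 2-D time-frequency image before CNN classification. The following is a minimal sketch, assuming NumPy only, of how one such TFR (a short-time Fourier transform magnitude spectrogram) could be computed; the window length, hop size, and the synthetic test signal are illustrative assumptions, not the settings or data used in the paper.

```python
import numpy as np

def stft_magnitude(x, win_len=256, hop=64):
    """Short-time Fourier transform magnitude spectrogram.

    Slides a Hann window over the signal, takes the real FFT of each
    windowed frame, and stacks the magnitudes into a 2-D array that
    can be treated as an image input for a CNN.
    win_len/hop are illustrative choices, not the paper's settings.
    """
    window = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1))  # (frames, freq bins)
    return spec.T                               # (freq bins, frames)

# Synthetic stand-in for a physiological signal: a 1.2 Hz tone plus noise,
# sampled at 256 Hz for 10 s (hypothetical values for illustration).
fs = 256
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
tfr = stft_magnitude(x)  # 2-D TFR image, here (129, 37)
```

In the paper's multimodal setting, one such image would be produced per modality (ECG, phasic EDA) and per TFR type (STFT, CWT, MFC), then passed to AlexNet, the cCNN, or VGG16 for feature extraction and classification.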
