Multimodal Deep Learning Model for Subject-Independent EEG-based Emotion Recognition

SY Dharia, CE Valderrama… - 2023 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), 2023 - ieeexplore.ieee.org
Regulating emotion is crucial for maintaining well-being and social relationships. However, as we age, the volume of the frontal lobes decreases, which can impair emotion regulation. Electroencephalography (EEG)-based emotion recognition has the potential to shed light on the complexity of human emotions and on the frontal-lobe atrophy that leads to cognitive impairment. In this study, we investigated a multimodal deep learning approach for subject-independent emotion recognition using EEG and eye movement data. To that end, we proposed an attention mechanism layer to fuse features extracted from the EEG and eye movement data. We tested our approach on two benchmark emotion recognition datasets: SEED-IV and SEED-V. Our approach achieved average accuracies of 67.3% on SEED-IV and 72.3% on SEED-V. These results demonstrate the potential of multimodal deep learning models for subject-independent emotion recognition using EEG and eye movement data, which can have important implications for assessing emotion regulation in clinical and research settings.
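
The abstract names only an attention layer that fuses EEG and eye-movement features; the paper's exact architecture is not given here. Below is a minimal PyTorch sketch of one common way to implement such modality-level attention fusion, assuming each modality arrives as a flat feature vector, is projected into a shared space, and is combined with softmax-normalized per-modality weights before a linear classifier. The class name AttentionFusion, the feature dimensions, and the projection design are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (PyTorch) of attention-based fusion of EEG and
# eye-movement features. Dimensions and layer choices are assumptions.
import torch
import torch.nn as nn


class AttentionFusion(nn.Module):
    """Fuses per-modality feature vectors with learned attention weights."""

    def __init__(self, eeg_dim: int, eye_dim: int, hidden_dim: int, n_classes: int):
        super().__init__()
        # Project both modalities into a shared hidden space (assumed design).
        self.eeg_proj = nn.Linear(eeg_dim, hidden_dim)
        self.eye_proj = nn.Linear(eye_dim, hidden_dim)
        # One scalar attention score per modality, normalized with softmax.
        self.score = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, eeg_feat: torch.Tensor, eye_feat: torch.Tensor) -> torch.Tensor:
        # Stack projected modalities: shape (batch, 2, hidden_dim).
        h = torch.stack(
            [torch.tanh(self.eeg_proj(eeg_feat)), torch.tanh(self.eye_proj(eye_feat))],
            dim=1,
        )
        weights = torch.softmax(self.score(h), dim=1)  # (batch, 2, 1)
        fused = (weights * h).sum(dim=1)               # weighted sum over modalities
        return self.classifier(fused)


# Example: SEED-IV has four emotion classes; the feature sizes below
# are placeholders, not the dimensions used in the paper.
model = AttentionFusion(eeg_dim=310, eye_dim=33, hidden_dim=128, n_classes=4)
logits = model(torch.randn(8, 310), torch.randn(8, 33))
print(logits.shape)  # torch.Size([8, 4])
```

Because the attention weights are computed per sample, a fusion layer of this kind can down-weight whichever modality is less informative for a given trial, which is one plausible reason to prefer it over simple feature concatenation in a subject-independent setting.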