An efficient LSTM network for emotion recognition from multichannel EEG signals

X Du, C Ma, G Zhang, J Li, YK Lai… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
Most previous EEG-based emotion recognition methods studied hand-crafted EEG features
extracted from different electrodes. In this article, we study the relation among different EEG …

A dataset of continuous affect annotations and physiological signals for emotion analysis

K Sharma, C Castellini, EL Van Den Broek… - Scientific data, 2019 - nature.com
From a computational viewpoint, emotions continue to be intriguingly hard to understand. In
research, a direct and real-time inspection in realistic settings is not possible. Discrete …

Emotional experience in uncomfortable indoor environments: A combined examination of personal factors

H Kim, T Hong - Building and Environment, 2023 - Elsevier
Emotional experience is a critical element in human perception and interaction with the
social environment. These experiences, while partially determined by internal human …

RCEA: Real-time, continuous emotion annotation for collecting precise mobile video ground truth labels

T Zhang, A El Ali, C Wang, A Hanjalic… - Proceedings of the 2020 …, 2020 - dl.acm.org
Collecting accurate and precise emotion ground truth labels for mobile video watching is
essential for ensuring meaningful predictions. However, video-based emotion annotation …

Few-shot learning for fine-grained emotion recognition using physiological signals

T Zhang, A El Ali, A Hanjalic… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Fine-grained emotion recognition can model the temporal dynamics of emotions, which is
more precise than predicting one emotion retrospectively for an activity (e.g., video clip …

RCEA-360VR: Real-time, continuous emotion annotation in 360° VR videos for collecting precise viewport-dependent ground truth labels

T Xue, A El Ali, T Zhang, G Ding, P Cesar - Proceedings of the 2021 CHI …, 2021 - dl.acm.org
Precise emotion ground truth labels for 360° virtual reality (VR) video watching are essential
for fine-grained predictions under varying viewing behavior. However, current annotation …

Weakly-supervised learning for fine-grained emotion recognition using physiological signals

T Zhang, A El Ali, C Wang, A Hanjalic… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Instead of predicting just one emotion for one activity (e.g., video watching), fine-grained
emotion recognition enables more temporally precise recognition. Previous works on fine …

Effect of sound sequence on soundscape emotions

Z Han, J Kang, Q Meng - Applied Acoustics, 2023 - Elsevier
This study analysed the effect of sound sequence on soundscape emotions with respect to
three aspects of sound sources: the number of sound source(s), changing trends in the …

Investigating the relationship between momentary emotion self-reports and head and eye movements in HMD-based 360° VR video watching

T Xue, AE Ali, G Ding, P Cesar - Extended abstracts of the 2021 CHI …, 2021 - dl.acm.org
Inferring emotions from Head Movement (HM) and Eye Movement (EM) data in 360° Virtual
Reality (VR) can enable a low-cost means of improving users' Quality of Experience …

Validation and application of the Non-Verbal Behavior Analyzer: An automated tool to assess non-verbal emotional expressions in psychotherapy

P Terhürne, B Schwartz, T Baur, D Schiller… - Frontiers in …, 2022 - frontiersin.org
Background Emotions play a key role in psychotherapy. However, a problem with examining
emotional states via self-report questionnaires is that the assessment usually takes place …