Numerous researchers in the field of automatic human emotion recognition have shown that physiological responses in humans are influenced by emotional changes. In most previous studies, the physiological data were collected by sensors placed on the human body. In recent years, researchers have shown that the human heart rate can be estimated from face videos. This technique extracts the photoplethysmography (PPG) signal by measuring the red–green–blue (RGB) color changes of a person's face during the cardiac cycle. In this respect, the present study proposes a new framework for classifying human emotions based on contactless PPG signals. The experiments used a widely used emotional database, the MAHNOB-HCI database, to evaluate several PPG signal extraction methods. For classification, after normalization and signal segmentation, a deep learning architecture combining a one-dimensional convolutional neural network (1DCNN) and a long short-term memory (LSTM) network was adopted. The adopted method achieved recognition rates of 73.33% and 60% for the binary classification of valence and arousal, respectively, with 4 s signal segments. The present paper thus primarily proposes a new approach to automatic emotion recognition using deep learning techniques based on contactless sensing of physiological signals.
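To illustrate the kind of classifier the abstract describes, the following is a minimal PyTorch sketch of a 1DCNN followed by an LSTM that maps a normalized PPG segment to binary valence or arousal logits. All layer sizes, kernel widths, and the 61 Hz sampling rate (giving roughly 244 samples per 4 s segment) are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class CNNLSTM(nn.Module):
    """Hypothetical 1DCNN + LSTM for binary valence/arousal classification.

    Layer sizes are illustrative assumptions, not the paper's architecture.
    """

    def __init__(self, n_classes: int = 2):
        super().__init__()
        # 1D convolutions extract local waveform features from the PPG segment
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models the temporal dynamics of the convolutional features
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, samples) -- a normalized rPPG segment
        h = self.conv(x)            # (batch, 64, samples // 4)
        h = h.transpose(1, 2)       # (batch, time, features) for the LSTM
        _, (hn, _) = self.lstm(h)   # final hidden state summarizes the segment
        return self.fc(hn[-1])      # (batch, n_classes) logits


# A 4 s segment at an assumed 61 Hz sampling rate is about 244 samples
segment = torch.randn(8, 1, 244)
logits = CNNLSTM()(segment)
print(logits.shape)  # torch.Size([8, 2])
```

The convolutional front end reduces the temporal resolution by a factor of four before the LSTM, which keeps the recurrent sequence short; the final hidden state is then mapped to two logits for the binary valence or arousal decision.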
© 2022 The Authors. Published by Elsevier B.V.