Research on classifying the affective states of a participant has produced a wide range of feature extraction methods across modalities such as facial motion, speech, biophysiological signals, and Action Units (AUs). The ability to estimate a participant's heart rate with remote Photoplethysmography (rPPG) from the video channel offers an interesting modality for the classification of affective states, but only a few authors have explored it. In this work, we present the rPPG signal as a new modality for pain classification and evaluate the benefit of fusing it with other modalities. In short, the rPPG signal is filtered in multiple frequency ranges, including one corresponding to the respiration rate, and treated as a biophysiological signal. Pain is then classified by fusing all modalities with a hierarchical fusion architecture. Performance improved by about 1.4% with the rPPG signal, even in combination with biophysiological signals recorded by a biosignal amplifier.
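As a minimal sketch of the filtering step described above, the following Python snippet band-pass filters a synthetic rPPG trace to isolate a respiration-range component. The sampling rate, band limits, filter order, and signal frequencies are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low, high, order=3):
    """Zero-phase Butterworth band-pass filter."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

# Synthetic rPPG trace at 30 fps: a 1.2 Hz "pulse" component plus a
# 0.3 Hz "respiration" component and noise (values are hypothetical).
np.random.seed(0)
fs = 30.0
t = np.arange(0, 60, 1 / fs)
rppg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.3 * t)
rppg += 0.1 * np.random.randn(t.size)

# Keep only an assumed respiration band (roughly 0.1-0.5 Hz).
resp = bandpass(rppg, fs, 0.1, 0.5)
```

After filtering, the dominant frequency of `resp` lies near the 0.3 Hz respiration component, while the 1.2 Hz pulse component is strongly attenuated; the filtered trace could then serve as one input stream to a fusion architecture.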