In this paper, we developed and integrated an AI-edge emotion recognition platform using multiple wearable physiological signal sensors: electroencephalogram (EEG), electrocardiogram (ECG), and photoplethysmogram (PPG) sensors. The emotion recognition platform combines two machine learning approaches, each with its own input and preprocessing stage: an EEG-based emotion recognition system and an ECG/PPG-based system. The EEG-based system is a convolutional neural network (CNN) that classifies three emotions: happiness, anger, and sadness. The CNN inputs are extracted from the EEG signals using the short-time Fourier transform (STFT), and the average accuracy for subject-independent classification reached 76.94%. The ECG/PPG-based system uses a similar CNN with an extracted feature vector as input; its subject-dependent classification reached an average accuracy of 76.8%. The proposed system was integrated on RISC-V processor and FPGA platforms to implement real-time monitoring and classification at the edge. A 3-to-1 Bluetooth piconet was deployed to transmit all physiological signals to a single platform access point and to take advantage of low-power wireless technologies.
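To make the EEG processing pipeline concrete, the following is a minimal sketch of the STFT-to-CNN flow described above. The sampling rate, channel count, window length, and network layers are illustrative assumptions, not the paper's actual configuration; only the use of STFT features and a three-class CNN output (happiness, anger, sadness) comes from the text.

```python
import numpy as np
from scipy.signal import stft
import torch
import torch.nn as nn

# Hypothetical parameters: the paper does not specify sampling rate,
# window length, or channel count, so these values are assumptions.
FS = 128            # EEG sampling rate (Hz), assumed
N_CHANNELS = 8      # number of EEG channels, assumed
WINDOW = 128        # STFT window length (1 s at 128 Hz), assumed
N_CLASSES = 3       # happiness, anger, sadness (as stated in the paper)

def eeg_to_stft_maps(eeg):
    """Convert raw EEG (channels x samples) to STFT magnitude maps."""
    maps = []
    for channel in eeg:
        _, _, zxx = stft(channel, fs=FS, nperseg=WINDOW)
        maps.append(np.abs(zxx))            # keep magnitude only
    return np.stack(maps)                   # (channels, freq_bins, time_frames)

class EmotionCNN(nn.Module):
    """Small CNN over per-channel STFT maps; the architecture is illustrative."""
    def __init__(self, n_channels=N_CHANNELS, n_classes=N_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),         # pool to a fixed-size vector
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: classify one 4-second window of synthetic EEG.
eeg_window = np.random.randn(N_CHANNELS, 4 * FS)
spec = eeg_to_stft_maps(eeg_window)
logits = EmotionCNN()(torch.from_numpy(spec).float().unsqueeze(0))
print(logits.shape)   # torch.Size([1, 3]) -> scores for the three emotions
```

The ECG/PPG branch would follow the same pattern, except that the CNN input is a precomputed feature vector rather than STFT maps.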