CPED: A large-scale Chinese personalized and emotional dialogue dataset for conversational AI

Y Chen, W Fan, X Xing, J Pang, M Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Human language expression is based on the subjective construal of a situation rather than on objective truth conditions, which means that speakers' personalities and emotions after …

DialogStudio: Towards richest and most diverse unified dataset collection for conversational AI

J Zhang, K Qian, Z Liu, S Heinecke, R Meng… - arXiv preprint arXiv …, 2023 - arxiv.org
Despite advancements in conversational AI, language models encounter challenges in handling diverse conversational tasks, and existing dialogue dataset collections often lack …

EVA: An open-domain Chinese dialogue system with large-scale generative pre-training

H Zhou, P Ke, Z Zhang, Y Gu, Y Zheng, C Zheng… - arXiv preprint arXiv …, 2021 - arxiv.org
Although pre-trained language models have remarkably enhanced the generation ability of
dialogue systems, open-domain Chinese dialogue systems are still limited by the dialogue …

Building a dialogue corpus annotated with expressed and experienced emotions

T Ide, D Kawahara - arXiv preprint arXiv:2205.11867, 2022 - arxiv.org
In communication, a human would recognize the emotion of an interlocutor and respond
with an appropriate emotion, such as empathy and comfort. Toward developing a dialogue …

Think twice: A human-like two-stage conversational agent for emotional response generation

Y Qian, B Wang, S Ma, W Bin, S Zhang, D Zhao… - arXiv preprint arXiv …, 2023 - arxiv.org
To move towards human-like dialogue systems, current emotional dialogue approaches jointly model emotion and semantics with a unified neural network. This strategy tends to generate …

Learning retrieval augmentation for personalized dialogue generation

Q Huang, S Fu, X Liu, W Wang, T Ko, Y Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Personalized dialogue generation, focusing on generating highly tailored responses by
leveraging persona profiles and dialogue context, has gained significant attention in …

PLATO-XL: Exploring the large-scale pre-training of dialogue generation

S Bao, H He, F Wang, H Wu, H Wang, W Wu… - arXiv preprint arXiv …, 2021 - arxiv.org
To explore the limits of dialogue generation pre-training, we present PLATO-XL, with up to 11 billion parameters, trained on both Chinese and English social media …

PanGu-Bot: Efficient generative dialogue pre-training from pre-trained language model

F Mi, Y Li, Y Zeng, J Zhou, Y Wang, C Xu… - arXiv preprint arXiv …, 2022 - arxiv.org
In this paper, we introduce PanGu-Bot, a Chinese pre-trained open-domain dialogue generation model based on the large pre-trained language model (PLM) PanGu-α (Zeng …

ChatPLUG: Open-domain generative dialogue system with internet-augmented instruction tuning for digital human

J Tian, H Chen, G Xu, M Yan, X Gao, J Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
In this paper, we present ChatPLUG, a Chinese open-domain dialogue system for digital human applications that is instruction-finetuned on a wide range of dialogue tasks in a unified …

EmotionX-IDEA: Emotion BERT--an Affectional Model for Conversation

YH Huang, SR Lee, MY Ma, YH Chen, YW Yu… - arXiv preprint arXiv …, 2019 - arxiv.org
In this paper, we investigate the emotion recognition ability of the pre-trained language model BERT. Owing to the two-sentence input structure of BERT, we …