Text Generation aims to produce plausible and readable text in human language from input data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved great success and become a milestone in the field of artificial intelligence (AI). Owing to …
Text retrieval is a long-standing research topic in information seeking, where a system is required to return relevant information resources in response to users' queries in natural language. From …
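The retrieval setting described above can be illustrated with a small sketch: score a toy corpus against a natural-language query using TF-IDF-weighted cosine similarity and return the most relevant documents first. The corpus, query, and scoring choices are illustrative assumptions, not a specific system from the snippet.

# A minimal sketch of lexical text retrieval: rank a toy corpus against a
# query by TF-IDF cosine similarity. Corpus and query are placeholders.
import math
from collections import Counter

corpus = [
    "neural models for open domain question answering",
    "dense passage retrieval for text search",
    "curriculum learning for machine translation",
]

def tf_idf(doc_tokens, all_docs):
    """Return a term -> tf-idf weight mapping for one tokenized document."""
    tf = Counter(doc_tokens)
    n_docs = len(all_docs)
    weights = {}
    for term, count in tf.items():
        df = sum(1 for d in all_docs if term in d)
        idf = math.log((1 + n_docs) / (1 + df)) + 1.0  # smoothed idf
        weights[term] = count * idf
    return weights

def cosine(a, b):
    """Cosine similarity between two sparse term -> weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

tokenized = [doc.split() for doc in corpus]
query_vec = tf_idf("text retrieval for search".split(), tokenized)
doc_vecs = [tf_idf(doc, tokenized) for doc in tokenized]

# Rank documents by similarity to the query (most relevant first).
ranking = sorted(range(len(corpus)),
                 key=lambda i: cosine(query_vec, doc_vecs[i]), reverse=True)
for i in ranking:
    print(f"{cosine(query_vec, doc_vecs[i]):.3f}  {corpus[i]}")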
Training machine learning models in a meaningful order, from the easy samples to the hard ones, using curriculum learning can provide performance improvements over the standard …
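The easy-to-hard ordering central to curriculum learning can be sketched as follows, assuming a hypothetical difficulty proxy (sentence length) and a placeholder train_step; a real curriculum would substitute a task-specific difficulty measure and an actual model update.

# A minimal sketch of curriculum learning: sort training samples by a
# difficulty proxy, then expose the model to progressively harder subsets
# over training stages ("baby steps" schedule). `train_step` is a
# hypothetical stand-in for a real optimizer update.
data = [
    "short sentence",
    "a slightly longer training sentence here",
    "a much longer and presumably harder training sentence with many words",
    "tiny",
]

def difficulty(sample: str) -> int:
    """Difficulty proxy: longer sentences are treated as harder."""
    return len(sample.split())

def train_step(batch):
    print("training on:", batch)  # placeholder for a real update step

ordered = sorted(data, key=difficulty)  # easy samples first
n_stages = 3
for stage in range(1, n_stages + 1):
    # Grow the visible portion of the curriculum stage by stage.
    cutoff = max(1, round(len(ordered) * stage / n_stages))
    train_step(ordered[:cutoff])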
With direct access to human-written reference as memory, retrieval-augmented generation has achieved much progress in a wide range of text generation tasks. Since better memory …
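A minimal sketch of the retrieve-then-generate pattern the snippet refers to: fetch the most similar human-written reference from a memory bank and prepend it to the input before generation. The memory bank, similarity measure, and generate function are illustrative assumptions, with generate standing in for a trained sequence-to-sequence model.

# A minimal sketch of retrieval-augmented generation with a toy memory bank.
from difflib import SequenceMatcher

memory_bank = [
    "The Eiffel Tower is a wrought-iron lattice tower in Paris.",
    "Mount Fuji is the highest mountain in Japan.",
]

def retrieve(query: str, memories: list[str]) -> str:
    """Return the memory with the highest surface similarity to the query."""
    return max(memories, key=lambda m: SequenceMatcher(None, query, m).ratio())

def generate(prompt: str) -> str:
    # Placeholder: a real system would call a trained generation model here.
    return f"[generated text conditioned on: {prompt!r}]"

source = "Describe the Eiffel Tower in one sentence."
memory = retrieve(source, memory_bank)      # better memory -> better output
output = generate(f"memory: {memory}\ninput: {source}")
print(output)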
Instruction tuning is an emergent paradigm in NLP wherein natural language instructions are leveraged with language models to induce zero-shot performance on unseen tasks …
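A sketch of how instruction-tuning examples are commonly assembled into (prompt, target) pairs for supervised fine-tuning; the template and field names below are assumptions for illustration, not a specific dataset's schema.

# A minimal sketch of instruction-tuning data preparation: each task example
# is rendered as a natural-language instruction plus an optional input, and
# the model is fine-tuned to produce the target output.
examples = [
    {
        "instruction": "Translate the sentence to French.",
        "input": "Where is the train station?",
        "output": "Où est la gare ?",
    },
    {
        "instruction": "Classify the sentiment as positive or negative.",
        "input": "The movie was a delight from start to finish.",
        "output": "positive",
    },
]

def render(example: dict) -> tuple[str, str]:
    """Turn one example into a (prompt, target) pair for supervised fine-tuning."""
    prompt = (f"Instruction: {example['instruction']}\n"
              f"Input: {example['input']}\nResponse:")
    return prompt, example["output"]

for ex in examples:
    prompt, target = render(ex)
    print(prompt, target, sep="\n", end="\n\n")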
Most open-domain dialogue models tend to perform poorly in the setting of long-term human-bot conversations. A possible reason is that they lack the capability of …
In the rapidly evolving domain of artificial intelligence, chatbots have emerged as a potent tool for various applications ranging from e-commerce to healthcare. This research delves …
In an era where artificial intelligence (AI) is reshaping educational paradigms, this study explores AI-based chatbot adoption in higher education among students and educators …