Large language models (LLMs) have demonstrated great success in various fields, benefiting from the vast number of parameters in which they store knowledge. However, LLMs still …
As a primary means of information acquisition, information retrieval (IR) systems, such as search engines, have become an integral part of our daily lives. These systems also serve …
As one of the most advanced techniques in AI, Retrieval-Augmented Generation (RAG) can supply reliable and up-to-date external knowledge, offering considerable convenience for numerous …
Y Lyu, Z Li, S Niu, F Xiong, B Tang, W Wang… - ACM Transactions on …, 2024 - dl.acm.org
Retrieval-Augmented Generation (RAG) is a technique that enhances the capabilities of large language models (LLMs) by incorporating external knowledge sources. This method …
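To make the retrieve-then-augment idea in the snippet above concrete, here is a minimal Python sketch. It is illustrative only and not the pipeline of any paper listed here: the toy corpus, the bag-of-words embed() stand-in, and the rag_prompt() helper are assumptions introduced for the example; a real system would use a learned sentence encoder and an actual LLM call in place of the final print.

import numpy as np

# Toy corpus standing in for an external knowledge source.
CORPUS = [
    "RAG retrieves documents and appends them to the prompt.",
    "Dense retrievers encode queries and passages into vectors.",
    "LLMs can hallucinate when parametric knowledge is outdated.",
]

def embed(texts):
    # Stand-in embedder: hash words into a fixed-size bag-of-words vector.
    # A real system would use a learned sentence encoder here.
    vecs = np.zeros((len(texts), 256))
    for i, text in enumerate(texts):
        for word in text.lower().split():
            vecs[i, hash(word) % 256] += 1.0
    # L2-normalise so the dot product equals cosine similarity.
    return vecs / (np.linalg.norm(vecs, axis=1, keepdims=True) + 1e-9)

def retrieve(query, k=2):
    # Rank corpus passages by cosine similarity to the query.
    q = embed([query])
    sims = (embed(CORPUS) @ q.T).ravel()
    return [CORPUS[i] for i in np.argsort(-sims)[:k]]

def rag_prompt(query):
    # Augment the user question with retrieved context before generation.
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(rag_prompt("Why do LLMs need external knowledge?"))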
X Wang, Z Wang, X Gao, F Zhang, Y Wu… - Proceedings of the …, 2024 - aclanthology.org
Retrieval-augmented generation (RAG) techniques have proven to be effective in integrating up-to-date information, mitigating hallucinations, and enhancing response quality …
Dense retrieval has become a prominent method to obtain relevant context or world knowledge in open-domain NLP tasks. When we use a learned dense retriever on a …
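The snippet above concerns learned dense retrievers. A common way such dual-encoder retrievers are trained is with an in-batch-negative (InfoNCE-style) contrastive objective; the sketch below computes that loss in NumPy under the assumption that each query's positive passage sits at the same batch index. The function name info_nce_loss and the toy batch are illustrative, not code from the cited work.

import numpy as np

def info_nce_loss(q_vecs, p_vecs, temperature=0.05):
    # In-batch-negative objective for a dual-encoder dense retriever:
    # the positive passage for query i is row i; all other rows act as negatives.
    q = q_vecs / np.linalg.norm(q_vecs, axis=1, keepdims=True)
    p = p_vecs / np.linalg.norm(p_vecs, axis=1, keepdims=True)
    logits = (q @ p.T) / temperature             # (batch, batch) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # cross-entropy on the diagonal

# Toy batch: 4 query embeddings and their paired passage embeddings.
rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 128))
passages = queries + 0.1 * rng.normal(size=(4, 128))  # positives lie near their queries
print(round(info_nce_loss(queries, passages), 4))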
This survey presents an in-depth exploration of knowledge distillation (KD) techniques within the realm of Large Language Models (LLMs), spotlighting the pivotal role of KD in …
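As background for the KD survey snippet above, the following is a sketch of the classic soft-label distillation objective: the student matches the teacher's temperature-softened output distribution via KL divergence. It is a generic illustration, not a method from the survey; distillation_loss and the random logits are assumed names and data.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soft-label distillation: KL divergence between the teacher's and the
    # student's temperature-softened distributions.
    t = softmax(teacher_logits / temperature)
    s = softmax(student_logits / temperature)
    kl = np.sum(t * (np.log(t + 1e-12) - np.log(s + 1e-12)), axis=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return (temperature ** 2) * np.mean(kl)

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 32000))  # e.g. logits over an LLM vocabulary
student = teacher + rng.normal(scale=0.5, size=teacher.shape)
print(round(distillation_loss(student, teacher), 4))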
In this work, we introduce ChatQA, a suite of models that outperform GPT-4 on retrieval-augmented generation (RAG) and conversational question answering (QA). To enhance …
Despite efforts to expand the knowledge of large language models (LLMs), knowledge gaps (missing or outdated information in LLMs) might always persist given the evolving nature of …