S Lee, H Hsu, CF Chen - arXiv preprint arXiv:2411.09689, 2024 - arxiv.org
LLM hallucination, where models occasionally generate unfaithful text, poses significant
challenges for their practical applications. Most existing detection methods rely on external …