Z He, Z Wang, X Liu, S Liu, Y Yao, Y Huang… - arXiv preprint arXiv …, 2024 - arxiv.org
In this technical report, we present TeleChat, a collection of large language models (LLMs) with 3 billion, 7 billion, and 12 billion parameters. The collection includes pretrained language models …