From Text to Transformation: A Comprehensive Review of Large Language Models' Versatility

P Kaur, GS Kashyap, A Kumar, MT Nafis… - arXiv preprint arXiv …, 2024 - arxiv.org
This groundbreaking study explores the breadth of Large Language Models (LLMs), such
as the Generative Pre-Trained Transformer (GPT) and Bidirectional Encoder Representations …

A survey on large language models: Applications, challenges, limitations, and practical usage

MU Hadi, R Qureshi, A Shah, M Irfan, A Zafar… - Authorea …, 2023 - techrxiv.org
Within the vast expanse of computerized language processing, a revolutionary entity known
as Large Language Models (LLMs) has emerged, wielding immense power in its capacity to …

Large language models: a comprehensive survey of its applications, challenges, limitations, and future prospects

MU Hadi, R Qureshi, A Shah, M Irfan, A Zafar… - Authorea …, 2023 - techrxiv.org
Within the vast expanse of computerized language processing, a revolutionary entity known
as Large Language Models (LLMs) has emerged, wielding immense power in its capacity to …

An analysis of large language models: their impact and potential applications

G Bharathi Mohan, R Prasanna Kumar… - … and Information Systems, 2024 - Springer
Large language models (LLMs) have transformed the interpretation and creation of human
language in the rapidly developing field of computerized language processing. These …

A review on large language models: Architectures, applications, taxonomies, open issues and challenges

MAK Raiaan, MSH Mukta, K Fatema, NM Fahad… - IEEE …, 2024 - ieeexplore.ieee.org
Large Language Models (LLMs) have recently demonstrated extraordinary capabilities in various
natural language processing (NLP) tasks, including language translation, text generation …

What the [MASK]? Making sense of language-specific BERT models

D Nozza, F Bianchi, D Hovy - arXiv preprint arXiv:2003.02912, 2020 - arxiv.org
Recently, Natural Language Processing (NLP) has witnessed impressive progress in
many areas, due to the advent of novel, pretrained contextual representation models. In …

A review of hybrid and ensemble in deep learning for natural language processing

J Jia, W Liang, Y Liang - arXiv preprint arXiv:2312.05589, 2023 - arxiv.org
This review presents a comprehensive exploration of hybrid and ensemble deep learning
models within Natural Language Processing (NLP), shedding light on their transformative …

Bangla-BERT: transformer-based efficient model for transfer learning and language understanding

M Kowsher, AA Sami, NJ Prottasha, MS Arefin… - IEEE …, 2022 - ieeexplore.ieee.org
The advent of pre-trained language models has ushered in a new era of Natural Language
Processing (NLP), enabling us to create powerful language models. Among these models …

Advancing transformer architecture in long-context large language models: A comprehensive survey

Y Huang, J Xu, Z Jiang, J Lai, Z Li, Y Yao… - arXiv preprint arXiv …, 2023 - arxiv.org
With the spark ignited by ChatGPT, Transformer-based Large Language Models (LLMs)
have paved a revolutionary path toward Artificial General Intelligence (AGI) and have been …

A mathematical interpretation of autoregressive generative pre-trained transformer and self-supervised learning

M Lee - Mathematics, 2023 - mdpi.com
In this paper, we present a rigorous mathematical examination of generative pre-trained
transformer (GPT) models and their autoregressive self-supervised learning mechanisms …