From Text to Transformation: A Comprehensive Review of Large Language Models' Versatility

P Kaur, GS Kashyap, A Kumar, MT Nafis, S Kumar, V Shokeen
arXiv preprint arXiv:2402.16142, 2024 - arxiv.org
This study explores the expanse of Large Language Models (LLMs), such as the Generative Pre-Trained Transformer (GPT) and Bidirectional Encoder Representations from Transformers (BERT), across domains ranging from technology, finance, and healthcare to education. Despite their established prowess in Natural Language Processing (NLP), these LLMs have not been systematically examined for their impact on domains such as fitness and holistic well-being, urban planning, climate modelling, and disaster management. In addition to furnishing a comprehensive analysis of the extent of LLMs' utility across diverse domains, this review identifies research gaps and realms where the potential of LLMs is yet to be harnessed. It uncovers innovative ways in which LLMs can leave a mark in fields such as fitness and well-being, urban planning, climate modelling, and disaster response, which could inspire future research and applications in these avenues.