GPT (Generative Pre-trained Transformer) – a comprehensive review on enabling technologies, potential applications, emerging challenges, and future directions

G Yenduri, M Ramalingam, GC Selvi, Y Supriya… - IEEE …, 2024 - ieeexplore.ieee.org
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the
domain of natural language processing, which is propelling us toward the development of …

Decoding ChatGPT: a taxonomy of existing research, current challenges, and possible future directions

SS Sohail, F Farhat, Y Himeur, M Nadeem… - Journal of King Saud …, 2023 - Elsevier
Chat Generative Pre-trained Transformer (ChatGPT) has gained significant interest
and attention since its launch in November 2022. It has shown impressive performance in …

A categorical archive of ChatGPT failures

A Borji - arXiv preprint arXiv:2302.03494, 2023 - arxiv.org
Large language models have been demonstrated to be valuable in different fields. ChatGPT,
developed by OpenAI, has been trained using massive amounts of data and simulates …

Large language models are state-of-the-art evaluators of translation quality

T Kocmi, C Federmann - arXiv preprint arXiv:2302.14520, 2023 - arxiv.org
We describe GEMBA, a GPT-based metric for assessment of translation quality, which works
both with a reference translation and without. In our evaluation, we focus on zero-shot …
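The GEMBA snippet hints at the basic recipe: prompt a GPT model zero-shot to rate a candidate translation, with or without a human reference. The sketch below is only an illustration of that idea; the prompt wording, the 0-100 direct-assessment scale, and the names build_prompt, score_translation, and query_llm are assumptions for this example, not the paper's actual prompt or code.

```python
# A minimal, hypothetical sketch of zero-shot, GEMBA-style translation quality scoring.
# The prompt wording, the 0-100 scale, and the function names are illustrative
# assumptions, not the exact prompt or code from Kocmi & Federmann (2023).
import re
from typing import Callable, Optional


def build_prompt(source: str, hypothesis: str, src_lang: str, tgt_lang: str,
                 reference: Optional[str] = None) -> str:
    """Assemble a direct-assessment style prompt on a 0-100 quality scale."""
    lines = [
        f"Score the following translation from {src_lang} to {tgt_lang} on a "
        "continuous scale from 0 (no meaning preserved) to 100 (perfect translation).",
        f"{src_lang} source: {source}",
    ]
    if reference is not None:
        # Reference-based variant; omit the reference for reference-free scoring.
        lines.append(f"{tgt_lang} human reference: {reference}")
    lines.append(f"{tgt_lang} translation: {hypothesis}")
    lines.append("Score:")
    return "\n".join(lines)


def score_translation(query_llm: Callable[[str], str], **segment) -> float:
    """Send the prompt to any text-completion model and parse the first number returned."""
    reply = query_llm(build_prompt(**segment))
    match = re.search(r"\d+(?:\.\d+)?", reply)
    return float(match.group()) if match else float("nan")


if __name__ == "__main__":
    # Stand-in model that always answers "87"; a real GPT call would go here.
    print(score_translation(lambda prompt: "87",
                            source="Das Haus ist klein.",
                            hypothesis="The house is small.",
                            src_lang="German",
                            tgt_lang="English"))
```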

ChatGPT beyond English: Towards a comprehensive evaluation of large language models in multilingual learning

VD Lai, NT Ngo, APB Veyseh, H Man… - arXiv preprint arXiv …, 2023 - arxiv.org
Over the last few years, large language models (LLMs) have emerged as the most important
breakthroughs in natural language processing (NLP) that fundamentally transform research …

Hallucinations in large multilingual translation models

NM Guerreiro, DM Alves, J Waldendorf… - Transactions of the …, 2023 - direct.mit.edu
Hallucinated translations can severely undermine trust and raise safety issues when machine
translation systems are deployed in the wild. Previous research on the topic focused on …

DatasetDM: Synthesizing data with perception annotations using diffusion models

W Wu, Y Zhao, H Chen, Y Gu, R Zhao… - Advances in …, 2023 - proceedings.neurips.cc
Current deep networks are very data-hungry and benefit from training on large-scale
datasets, which are often time-consuming to collect and annotate. By contrast, synthetic data …

A comprehensive capability analysis of GPT-3 and GPT-3.5 series models

J Ye, X Chen, N Xu, C Zu, Z Shao, S Liu, Y Cui… - arXiv preprint arXiv …, 2023 - arxiv.org
GPT series models, such as GPT-3, CodeX, InstructGPT, ChatGPT, and so on, have gained
considerable attention due to their exceptional natural language processing capabilities …

Will affective computing emerge from foundation models and general artificial intelligence? A first evaluation of ChatGPT

MM Amin, E Cambria, BW Schuller - IEEE Intelligent Systems, 2023 - ieeexplore.ieee.org
ChatGPT has shown the potential of emerging general artificial intelligence capabilities, as it
has demonstrated competent performance across many natural language processing tasks …

Machine culture

L Brinkmann, F Baumann, JF Bonnefon… - Nature Human …, 2023 - nature.com
The ability of humans to create and disseminate culture is often credited as the single most
important factor of our success as a species. In this Perspective, we explore the notion of …