Artificial intelligence foundation and pre-trained models: Fundamentals, applications, opportunities, and social impacts

A Kolides, A Nawaz, A Rathor, D Beeman… - … Modelling Practice and …, 2023 - Elsevier
With the emergence of foundation models (FMs) that are trained on large amounts of data at
scale and adaptable to a wide range of downstream applications, AI is experiencing a …

Foundation models: a new paradigm for artificial intelligence

J Schneider, C Meske, P Kuss - Business & Information Systems …, 2024 - Springer
Recently, the domain of artificial intelligence (AI) has experienced a profound transformation
with the emergence of foundation models as a new paradigm for developing AI systems …

Augmenting research methods with foundation models and generative AI

S Rossi, M Rossi, RR Mukkamala, JB Thatcher… - International Journal of …, 2024 - Elsevier
Deep learning (DL) research has made remarkable progress in recent years. Natural
language processing and image generation have made the leap from computer science …

Model-as-a-service (MaaS): A survey

W Gan, S Wan, PS Yu - 2023 IEEE International Conference …, 2023 - ieeexplore.ieee.org
Once the number of parameters and the amount of data in a pre-trained model exceed a
certain level, a foundation model (e.g., a large language model) can significantly improve …

Foundation models are platform models: Prompting and the political economy of AI

S Burkhardt, B Rieder - Big Data & Society, 2024 - journals.sagepub.com
A recent innovation in the field of machine learning has been the creation of very large
pre-trained models, also referred to as 'foundation models', that draw on much larger and …

Learn from model beyond fine-tuning: A survey

H Zheng, L Shen, A Tang, Y Luo, H Hu, B Du… - arXiv preprint arXiv …, 2023 - arxiv.org
Foundation models (FM) have demonstrated remarkable performance across a wide range
of tasks (especially in the fields of natural language processing and computer vision) …

Learnware: Small models do big

ZH Zhou, ZH Tan - Science China Information Sciences, 2024 - Springer
There are complaints about current machine learning techniques, such as the requirement of
a huge amount of training data and proficient training skills, the difficulty of continual …

Foundation models for decision making: Problems, methods, and opportunities

S Yang, O Nachum, Y Du, J Wei, P Abbeel… - arXiv preprint arXiv …, 2023 - arxiv.org
Foundation models pretrained on diverse data at scale have demonstrated extraordinary
capabilities in a wide range of vision and language tasks. When such models are deployed …

A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is …

Pre-trained models: Past, present and future

X Han, Z Zhang, N Ding, Y Gu, X Liu, Y Huo, J Qiu… - AI Open, 2021 - Elsevier
Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved
great success and become a milestone in the field of artificial intelligence (AI). Owing to …