A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is …

A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends

J Gui, T Chen, J Zhang, Q Cao, Z Sun… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep supervised learning algorithms typically require a large volume of labeled data to
achieve satisfactory performance. However, the process of collecting and labeling such data …

Self-supervised learning from images with a joint-embedding predictive architecture

M Assran, Q Duval, I Misra… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper demonstrates an approach for learning highly semantic image representations
without relying on hand-crafted data augmentations. We introduce the Image-based Joint …

Robust and data-efficient generalization of self-supervised machine learning for diagnostic imaging

S Azizi, L Culp, J Freyberg, B Mustafa, S Baur… - Nature Biomedical …, 2023 - nature.com
Machine-learning models for medical tasks can match or surpass the performance
of clinical experts. However, in settings differing from those of the training dataset, the …

Generalizing to unseen domains: A survey on domain generalization

J Wang, C Lan, C Liu, Y Ouyang, T Qin… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Machine learning systems generally assume that the training and testing distributions are
the same. In practice they often differ, so a key requirement is to develop models that can generalize to unseen …

VICRegL: Self-supervised learning of local visual features

A Bardes, J Ponce, Y LeCun - Advances in Neural …, 2022 - proceedings.neurips.cc
Most recent self-supervised methods for learning image representations focus on either
producing a global feature with invariance properties, or producing a set of local features …

Self-supervised learning with data augmentations provably isolates content from style

J Von Kügelgen, Y Sharma, L Gresele… - Advances in neural …, 2021 - proceedings.neurips.cc
Self-supervised representation learning has shown remarkable success in a number of
domains. A common practice is to perform data augmentation via hand-crafted …

Provable guarantees for self-supervised deep learning with spectral contrastive loss

JZ HaoChen, C Wei, A Gaidon… - Advances in Neural …, 2021 - proceedings.neurips.cc
Recent works in self-supervised learning have advanced the state of the art by relying on
the contrastive learning paradigm, which learns representations by pushing positive pairs, or …

Self-supervised learning: Generative or contrastive

X Liu, F Zhang, Z Hou, L Mian, Z Wang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Deep supervised learning has achieved great success in the last decade. However, its
heavy dependence on manual labels and vulnerability to attacks have driven …

Less data, more knowledge: Building next generation semantic communication networks

C Chaccour, W Saad, M Debbah… - … Surveys & Tutorials, 2024 - ieeexplore.ieee.org
Semantic communication is viewed as a revolutionary paradigm that can potentially
transform how we design and operate wireless communication systems. However, despite a …