How far pre-trained models are from neural collapse on the target dataset informs their transferability

Z Wang, Y Luo, L Zheng, Z Huang… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper focuses on model transferability estimation, i.e., assessing the performance of pre-
trained models on a downstream task without performing fine-tuning. Motivated by the …

Exploring model transferability through the lens of potential energy

X Li, Z Hu, Y Ge, Y Shan… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Transfer learning has become crucial in computer vision tasks due to the vast availability of
pre-trained deep learning models. However, selecting the optimal pre-trained model from a …

Etran: Energy-based transferability estimation

M Gholami, M Akbari, X Wang… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper addresses the problem of ranking pre-trained models for object detection and
image classification. Selecting the best pre-trained model by fine-tuning is an expensive and …

Model spider: Learning to rank pre-trained models efficiently

YK Zhang, TJ Huang, YX Ding… - Advances in Neural …, 2024 - proceedings.neurips.cc
Figuring out which Pre-Trained Model (PTM) from a model zoo fits the target task is
essential to take advantage of plentiful model resources. With the availability of numerous …

Analysis of task transferability in large pre-trained classifiers

A Mehra, Y Zhang, J Hamm - arXiv preprint arXiv:2307.00823, 2023 - arxiv.org
Transfer learning transfers the knowledge acquired by a model from a source task to
multiple downstream target tasks with minimal fine-tuning. The success of transfer learning …

Foundation model is efficient multimodal multitask model selector

F Meng, W Shao, Z Peng, C Jiang, K Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
This paper investigates an under-explored but important problem: given a collection of pre-
trained neural networks, predicting their performance on each multi-modal task without fine …

How to determine the most powerful pre-trained language model without brute force fine-tuning? An empirical survey

J Bai, X Zhang, C Li, H Hong, X Xu, C Lin… - arXiv preprint arXiv …, 2023 - arxiv.org
Transferability estimation has attracted great attention in the computer vision field.
Researchers try to estimate, at low computational cost, the performance of a model when …

LEAD: Exploring Logit Space Evolution for Model Selection

Z Hu, X Li, S Tang, J Liu, Y Hu… - Proceedings of the …, 2024 - openaccess.thecvf.com
The remarkable success of the "pretrain-then-finetune" paradigm has led to a proliferation of
available pre-trained models for vision tasks. This surge presents a significant challenge in …

Active Transferability Estimation

TR Menta, S Jandial, A Patil, S Bachu… - Proceedings of the …, 2024 - openaccess.thecvf.com
As transfer learning techniques are increasingly used to transfer knowledge from the source
model to the target task, it becomes important to quantify which source models are suitable …