Abstract We introduce Task Switching Networks (TSNs), a task-conditioned architecture with a single unified encoder/decoder for efficient multi-task learning. Multiple tasks are …
Transferability estimation has been an essential tool in transfer learning for selecting a pre-trained model, and the layers in it, to transfer, so as to maximize the performance on a target …
In recent years, remarkable achievements have been made in artificial intelligence tasks and applications based on deep neural networks (DNNs), especially in the fields of vision …
Abstract Figuring out which Pre-Trained Model (PTM) from a model zoo fits the target task is essential to take advantage of plentiful model resources. With the availability of numerous …
Y Xue, R Yang, X Chen, W Liu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep transfer learning has become increasingly prevalent in various fields such as industry and medical science in recent years. To ensure the successful implementation of target …
Abstract Knowledge distillation has demonstrated encouraging performances in deep model compression. Most existing approaches, however, require massive labeled data to …
J Song, Z Xu, S Wu, G Chen… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
The last decade has witnessed the success of deep learning and the surge of publicly released trained models, which necessitates the quantification of the model functional …
We address the problem of ensemble selection in transfer learning: given a large pool of source models, we want to select an ensemble of models which, after fine-tuning on the …
This paper is concerned with ranking many pre-trained deep neural networks (DNNs), called checkpoints, for transfer learning to a downstream task. Thanks to the broad use of …
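The snippets above all revolve around the same task: scoring a pool of pre-trained checkpoints by how well they are likely to transfer to a target dataset, without fully fine-tuning each one. As a minimal illustrative sketch (not the method of any specific paper listed here), one simple proxy is linear-probe accuracy: extract frozen features from each checkpoint, fit a cheap linear classifier, and rank checkpoints by held-out accuracy. The checkpoint names and synthetic features below are hypothetical stand-ins.

```python
# Hedged sketch: rank pre-trained "checkpoints" by a simple transferability
# proxy -- linear-probe accuracy on frozen features. Synthetic data stands in
# for features a real feature extractor would produce.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def linear_probe_score(features, labels):
    """Fit a linear classifier on frozen features; return held-out accuracy."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        features, labels, test_size=0.3, random_state=0, stratify=labels)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return clf.score(X_te, y_te)

# Synthetic stand-ins for target-task features extracted by three checkpoints.
n, d, n_classes = 300, 16, 3
labels = rng.integers(0, n_classes, size=n)
checkpoints = {
    # strong checkpoint: features correlate heavily with the labels
    "ckpt_a": rng.normal(size=(n, d)) + labels[:, None],
    # mediocre checkpoint: weaker label signal in the features
    "ckpt_b": rng.normal(size=(n, d)) + 0.3 * labels[:, None],
    # poor checkpoint: pure noise, no transferable signal
    "ckpt_c": rng.normal(size=(n, d)),
}

# Rank checkpoints best-first by their probe accuracy on the target task.
ranking = sorted(checkpoints,
                 key=lambda k: linear_probe_score(checkpoints[k], labels),
                 reverse=True)
print(ranking)
```

Real transferability metrics in the papers above replace the probe with cheaper closed-form scores (e.g., evidence- or separability-based), since even a linear probe per checkpoint becomes costly over a large model zoo; the ranking interface, however, is the same.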