Efficiently identifying task groupings for multi-task learning

C Fifty, E Amid, Z Zhao, T Yu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Multi-task learning can leverage information learned by one task to benefit the training of
other tasks. Despite this capacity, naively training all tasks together in one model often …

Multi-task learning with deep neural networks: A survey

M Crawshaw - arXiv preprint arXiv:2009.09796, 2020 - arxiv.org
Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are
simultaneously learned by a shared model. Such approaches offer advantages like …
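The "shared model" the survey refers to is most commonly hard parameter sharing: one encoder computed once, with lightweight task-specific heads on top. A minimal sketch, with purely illustrative toy transforms standing in for real network layers:

```python
# Hard parameter sharing, sketched in plain Python.
# shared_encoder and the head parameters are toy placeholders,
# not from any of the cited papers.

def shared_encoder(x):
    # Shared representation: a toy affine transform.
    return [2.0 * v + 1.0 for v in x]

def make_head(scale):
    # Each task head applies its own (toy) parameters.
    def head(features):
        return sum(scale * f for f in features)
    return head

heads = {"depth": make_head(0.5), "segmentation": make_head(-1.0)}

x = [1.0, 2.0, 3.0]
features = shared_encoder(x)  # computed once, reused by every task
outputs = {task: head(features) for task, head in heads.items()}
```

The advantage the snippet alludes to is visible here: the encoder's cost is paid once regardless of how many heads consume its output.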

Multi-task learning for dense prediction tasks: A survey

S Vandenhende, S Georgoulis… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
With the advent of deep learning, many dense prediction tasks, i.e., tasks that produce pixel-level predictions, have seen significant performance improvements. The typical approach is …

Which tasks should be learned together in multi-task learning?

T Standley, A Zamir, D Chen, L Guibas… - International …, 2020 - proceedings.mlr.press
Many computer vision applications require solving multiple tasks in real-time. A neural
network can be trained to solve multiple tasks simultaneously using multi-task learning. This …
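One way to frame the question in the title is as a search over task partitions: score each candidate grouping by how well its co-trained pairs get along, then pick the best partition. The sketch below uses a hypothetical pairwise-affinity table (the numbers are invented, not from the paper) and brute-forces all set partitions of a small task list:

```python
from itertools import combinations

# Hypothetical pairwise affinities: positive means the pair tends to
# help each other when trained together, negative means interference.
affinity = {("depth", "normals"): 0.4,
            ("depth", "keypoints"): -0.2,
            ("normals", "keypoints"): 0.1}

def pair_score(a, b):
    return affinity.get((a, b), affinity.get((b, a), 0.0))

def partitions(items):
    # Enumerate all set partitions of a small task list.
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

def grouping_score(groups):
    # Sum affinities of every pair that shares a group.
    return sum(pair_score(a, b) for g in groups for a, b in combinations(g, 2))

tasks = ["depth", "normals", "keypoints"]
best = max(partitions(tasks), key=grouping_score)
```

Exhaustive enumeration only works for a handful of tasks (the number of partitions grows as the Bell numbers); the papers above are largely about estimating affinities and searching this space more cheaply.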

MulT: An end-to-end multitask learning transformer


D Bhattacharjee, T Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
We propose an end-to-end Multitask Learning Transformer framework, named MulT, to
simultaneously learn multiple high-level vision tasks, including depth estimation, semantic …

Ensembling off-the-shelf models for GAN training

N Kumari, R Zhang, E Shechtman… - Proceedings of the …, 2022 - openaccess.thecvf.com
The advent of large-scale training has produced a cornucopia of powerful visual recognition
models. However, generative models, such as GANs, have traditionally been trained from …

When Multitask Learning Meets Partial Supervision: A Computer Vision Review

M Fontana, M Spratling, M Shi - Proceedings of the IEEE, 2024 - ieeexplore.ieee.org
Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their
mutual relationships. By using shared resources to simultaneously calculate multiple …

What makes instance discrimination good for transfer learning?

N Zhao, Z Wu, RWH Lau, S Lin - arXiv preprint arXiv:2006.06606, 2020 - arxiv.org
Contrastive visual pretraining based on the instance discrimination pretext task has made
significant progress. Notably, recent work on unsupervised pretraining has been shown to surpass …

Auto-lambda: Disentangling dynamic task relationships

S Liu, S James, AJ Davison, E Johns - arXiv preprint arXiv:2202.03091, 2022 - arxiv.org
Understanding the structure of multiple related tasks allows for multi-task learning to improve
the generalisation ability of one or all of them. However, it usually requires training each …
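The task relationships Auto-Lambda disentangles enter the objective as per-task weights on a combined loss. A minimal sketch of that combination, with fixed placeholder weights standing in for the dynamically learned lambdas (the task names and values are illustrative, not from the paper):

```python
# Weighted multi-task loss combination. In Auto-Lambda the lambda_i
# are learned and updated during training; here they are fixed
# placeholders to show the structure of the objective.

def combined_loss(task_losses, lambdas):
    # Weighted sum of per-task losses: sum_i lambda_i * L_i.
    return sum(lambdas[t] * loss for t, loss in task_losses.items())

losses = {"semantic": 0.8, "depth": 1.2}
lambdas = {"semantic": 1.0, "depth": 0.5}
total = combined_loss(losses, lambdas)  # 1.0*0.8 + 0.5*1.2 = 1.4
```

Making the lambdas trainable (rather than hand-tuned) is what lets the method discover which auxiliary tasks actually help a target task.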

Branched multi-task networks: deciding what layers to share

S Vandenhende, S Georgoulis… - arXiv preprint arXiv …, 2019 - arxiv.org
In the context of multi-task learning, neural networks with branched architectures have often
been employed to jointly tackle the tasks at hand. Such ramified networks typically start with …