Recent advances in Bayesian optimization

X Wang, Y Jin, S Schmitt, M Olhofer - ACM Computing Surveys, 2023 - dl.acm.org
Bayesian optimization has emerged at the forefront of expensive black-box optimization due
to its data efficiency. Recent years have witnessed a proliferation of studies on the …

An overview of multi-task learning in deep neural networks

S Ruder - arXiv preprint arXiv:1706.05098, 2017 - arxiv.org
Multi-task learning (MTL) has led to successes in many applications of machine learning,
from natural language processing and speech recognition to computer vision and drug …

Multi-task learning for dense prediction tasks: A survey

S Vandenhende, S Georgoulis… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
With the advent of deep learning, many dense prediction tasks, i.e., tasks that produce pixel-
level predictions, have seen significant performance improvements. The typical approach is …

Probabilistic model-agnostic meta-learning

C Finn, K Xu, S Levine - Advances in neural information …, 2018 - proceedings.neurips.cc
Meta-learning for few-shot learning entails acquiring a prior over previous tasks and
experiences, such that new tasks can be learned from small amounts of data. However, a critical …

A survey on multi-task learning

Y Zhang, Q Yang - IEEE transactions on knowledge and data …, 2021 - ieeexplore.ieee.org
Multi-Task Learning (MTL) is a learning paradigm in machine learning whose aim is to
leverage useful information contained in multiple related tasks to help improve the …

Recasting gradient-based meta-learning as hierarchical Bayes

E Grant, C Finn, S Levine, T Darrell… - arXiv preprint arXiv …, 2018 - arxiv.org
Meta-learning allows an intelligent agent to leverage prior learning episodes as a basis for
quickly improving performance on a novel task. Bayesian hierarchical modeling provides a …

M³ViT: Mixture-of-experts vision transformer for efficient multi-task learning with model-accelerator co-design

Z Fan, R Sarkar, Z Jiang, T Chen… - Advances in …, 2022 - proceedings.neurips.cc
Multi-task learning (MTL) encapsulates multiple learned tasks in a single model and often
lets those tasks learn better jointly. Multi-tasking models have become successful and often …

[BOOK][B] Lifelong machine learning

Z Chen, B Liu - 2022 - books.google.com
Lifelong Machine Learning, Second Edition is an introduction to an advanced machine
learning paradigm that continuously learns by accumulating past knowledge that it then …

AdaMV-MoE: Adaptive multi-task vision mixture-of-experts

T Chen, X Chen, X Du, A Rashwan… - Proceedings of the …, 2023 - openaccess.thecvf.com
Sparsely activated Mixture-of-Experts (MoE) is becoming a promising paradigm for
multi-task learning (MTL). Instead of compressing multiple tasks' knowledge into a single …

Transfer learning in brain-computer interfaces

V Jayaram, M Alamgir, Y Altun… - IEEE Computational …, 2016 - ieeexplore.ieee.org
The performance of brain-computer interfaces (BCIs) improves with the amount of available
training data; the statistical distribution of this data, however, varies across subjects as well …