S Ruder - arXiv preprint arXiv:1706.05098, 2017 - arxiv.org
Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug …
With the advent of deep learning, many dense prediction tasks, i.e., tasks that produce pixel-level predictions, have seen significant performance improvements. The typical approach is …
C Finn, K Xu, S Levine - Advances in neural information …, 2018 - proceedings.neurips.cc
Meta-learning for few-shot learning entails acquiring a prior over previous tasks and experiences, such that new tasks can be learned from small amounts of data. However, a critical …
Y Zhang, Q Yang - IEEE transactions on knowledge and data …, 2021 - ieeexplore.ieee.org
Multi-Task Learning (MTL) is a learning paradigm in machine learning whose aim is to leverage useful information contained in multiple related tasks to help improve the …
Meta-learning allows an intelligent agent to leverage prior learning episodes as a basis for quickly improving performance on a novel task. Bayesian hierarchical modeling provides a …
Multi-task learning (MTL) encapsulates multiple learned tasks in a single model and often lets those tasks learn better jointly. Multi-task models have become successful and often …
Lifelong Machine Learning, Second Edition is an introduction to an advanced machine learning paradigm that continuously learns by accumulating past knowledge that it then …
Sparsely activated Mixture-of-Experts (MoE) is becoming a promising paradigm for multi-task learning (MTL). Instead of compressing multiple tasks' knowledge into a single …
The performance of brain-computer interfaces (BCIs) improves with the amount of available training data; the statistical distribution of this data, however, varies across subjects as well …