A convex formulation for learning shared structures from multiple tasks

J Chen, L Tang, J Liu, J Ye - Proceedings of the 26th Annual International Conference on Machine Learning, 2009 - dl.acm.org
Multi-task learning (MTL) aims to improve generalization performance by learning multiple related tasks simultaneously. In this paper, we consider the problem of learning shared structures from multiple related tasks. We present an improved formulation (iASO) for multi-task learning based on the non-convex alternating structure optimization (ASO) algorithm, in which all tasks are related by a shared feature representation. We convert iASO, a non-convex formulation, into a relaxed convex one, which is, however, not scalable to large data sets due to its complex constraints. We propose an alternating optimization (cASO) algorithm which solves the convex relaxation efficiently, and further show that cASO converges to a global optimum. In addition, we present a theoretical condition, under which cASO can find a globally optimal solution to iASO. Experiments on several benchmark data sets confirm our theoretical analysis.
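For context, a hedged sketch of the iASO objective the abstract describes, with notation assumed from the ASO literature rather than quoted from this entry: the predictor for task ℓ is decomposed as w_ℓ = u_ℓ + Θ^⊤v_ℓ, where u_ℓ is task-specific, Θ ∈ R^{h×d} is a row-orthonormal shared structure, L is a convex loss, and α, β ≥ 0 are regularization parameters.

```latex
% Hedged sketch of iASO (symbols assumed, not quoted from the abstract):
% m tasks, n_ell samples in task ell, predictor w_ell = u_ell + Theta^T v_ell.
\min_{\{u_\ell, v_\ell\},\, \Theta}\;
  \sum_{\ell=1}^{m} \left(
    \frac{1}{n_\ell} \sum_{i=1}^{n_\ell}
      L\!\left( (u_\ell + \Theta^{\top} v_\ell)^{\top} x_i^{\ell},\; y_i^{\ell} \right)
    + \alpha \lVert u_\ell \rVert_2^{2}
    + \beta  \lVert u_\ell + \Theta^{\top} v_\ell \rVert_2^{2}
  \right)
\quad \text{s.t.} \quad \Theta \Theta^{\top} = I_h .
```

The non-convexity stems from the orthonormality constraint on Θ; replacing the feasible set {Θ^⊤Θ : ΘΘ^⊤ = I_h} with its convex hull {M : tr(M) = h, 0 ⪯ M ⪯ I_d} is the standard route to a convex relaxation of this kind, and appears to be the relaxation the abstract refers to.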