Efficient recovery of jointly sparse vectors

L Sun, J Liu, J Chen, J Ye - Advances in Neural Information …, 2009 - proceedings.neurips.cc
Abstract
We consider the reconstruction of sparse signals in the multiple measurement vector (MMV) model, in which the signal, represented as a matrix, consists of a set of jointly sparse vectors. MMV is an extension of the single measurement vector (SMV) model employed in standard compressive sensing (CS). Recent theoretical studies focus on the convex relaxation of the MMV problem based on $(2,1)$-norm minimization, which is an extension of the well-known $\ell_1$-norm minimization employed in SMV. However, the resulting convex optimization problem in MMV is significantly more difficult to solve than the one in SMV. Existing algorithms reformulate it as a second-order cone program (SOCP) or semidefinite program (SDP), which is computationally expensive to solve for problems of moderate size. In this paper, we propose a new (dual) reformulation of the convex optimization problem in MMV and develop an efficient algorithm based on the prox-method. Interestingly, our theoretical analysis reveals a close connection between the proposed reformulation and multiple kernel learning. Our simulation studies demonstrate the scalability of the proposed algorithm.
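To make the $(2,1)$-norm relaxation concrete, the sketch below recovers a row-sparse matrix $X$ from measurements $Y = AX$ by proximal gradient descent on $\tfrac{1}{2}\|AX - Y\|_F^2 + \lambda \sum_i \|X_{i,:}\|_2$. This is only an illustrative first-order solver (ISTA with row-wise soft-thresholding), not the dual prox-method proposed in the paper; the function name and all parameters are hypothetical.

```python
import numpy as np

def mmv_l21_ista(A, Y, lam=0.01, iters=2000):
    """Illustrative proximal-gradient sketch (not the paper's prox-method):
    minimize 0.5 * ||A X - Y||_F^2 + lam * sum_i ||X[i, :]||_2.
    The (2,1)-norm penalty encourages entire rows of X to be zero,
    which is the joint-sparsity structure of the MMV model."""
    n, k = A.shape[1], Y.shape[1]
    X = np.zeros((n, k))
    # step size 1/L, where L = ||A||_2^2 bounds the gradient's Lipschitz constant
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        G = A.T @ (A @ X - Y)                  # gradient of the smooth term
        Z = X - step * G                       # gradient step
        # prox of the (2,1)-norm: soft-threshold each row's Euclidean norm
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(1.0 - step * lam / np.maximum(norms, 1e-12), 0.0)
        X = scale * Z
    return X

# tiny demo: three jointly sparse vectors sharing the same two active rows
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
X_true = np.zeros((40, 3))
X_true[[5, 17], :] = rng.standard_normal((2, 3))
Y = A @ X_true
X_hat = mmv_l21_ista(A, Y)
```

Because the penalty acts on row norms rather than individual entries, all columns of `X_hat` share a common support, which is exactly what distinguishes MMV recovery from running SMV $\ell_1$-minimization column by column.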