Efficient recovery of jointly sparse vectors

Liang Sun, Jun Liu, Jianhui Chen, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

33 Citations (Scopus)

Abstract

We consider the reconstruction of sparse signals in the multiple measurement vector (MMV) model, in which the signal, represented as a matrix, consists of a set of jointly sparse vectors. MMV is an extension of the single measurement vector (SMV) model employed in standard compressive sensing (CS). Recent theoretical studies focus on the convex relaxation of the MMV problem based on the (2, 1)-norm minimization, which is an extension of the well-known 1-norm minimization employed in SMV. However, the resulting convex optimization problem in MMV is significantly more difficult to solve than the one in SMV. Existing algorithms reformulate it as a second-order cone programming (SOCP) or semidefinite programming (SDP) problem, which is computationally expensive to solve for problems of moderate size. In this paper, we propose a new (dual) reformulation of the convex optimization problem in MMV and develop an efficient algorithm based on the prox-method. Interestingly, our theoretical analysis reveals the close connection between the proposed reformulation and multiple kernel learning. Our simulation studies demonstrate the scalability of the proposed algorithm.
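To make the (2, 1)-norm concrete: for a signal matrix W whose columns are the jointly sparse vectors, the (2, 1)-norm sums the Euclidean norms of the rows, so minimizing it drives entire rows to zero at once, which is exactly the joint-sparsity pattern the abstract describes. The sketch below (plain NumPy, not the paper's algorithm; the function names and the threshold parameter `t` are illustrative) computes this norm and the row-wise soft-thresholding operation that proximal methods for this penalty typically rely on.

```python
import numpy as np

def norm_21(W):
    """(2,1)-norm: sum of the Euclidean norms of the rows of W.
    Penalizing it encourages whole rows to be zero, i.e. joint
    sparsity across the measurement vectors (the columns)."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

def prox_21(W, t):
    """Proximal operator of t * (2,1)-norm: row-wise soft thresholding.
    Each row is shrunk by t in Euclidean length; rows shorter than t
    are zeroed out entirely."""
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(row_norms, 1e-12), 0.0)
    return scale * W

# A jointly sparse signal matrix: rows are either active in both
# measurement vectors or zero in both.
W = np.array([[3.0, 4.0],   # active row, Euclidean norm 5
              [0.0, 0.0],   # inactive row
              [0.1, 0.0]])  # weak row, removed by thresholding

print(norm_21(W))           # 5.1
print(prox_21(W, 0.5))      # first row shrunk to [2.7, 3.6], others zeroed
```

This row-wise shrinkage generalizes the scalar soft thresholding behind 1-norm (SMV) solvers, which is why the MMV relaxation is a natural extension of the SMV one.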

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
Pages: 1812-1820
Number of pages: 9
State: Published - 2009
Event: 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 - Vancouver, BC, Canada
Duration: Dec 7, 2009 – Dec 10, 2009



ASJC Scopus subject areas

  • Information Systems

Cite this

Sun, L., Liu, J., Chen, J., & Ye, J. (2009). Efficient recovery of jointly sparse vectors. In Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference (pp. 1812-1820).

@inproceedings{6460004df295407baebb606240d83277,
title = "Efficient recovery of jointly sparse vectors",
abstract = "We consider the reconstruction of sparse signals in the multiple measurement vector (MMV) model, in which the signal, represented as a matrix, consists of a set of jointly sparse vectors. MMV is an extension of the single measurement vector (SMV) model employed in standard compressive sensing (CS). Recent theoretical studies focus on the convex relaxation of the MMV problem based on the (2, 1)-norm minimization, which is an extension of the well-known 1-norm minimization employed in SMV. However, the resulting convex optimization problem in MMV is significantly more difficult to solve than the one in SMV. Existing algorithms reformulate it as a second-order cone programming (SOCP) or semidefinite programming (SDP) problem, which is computationally expensive to solve for problems of moderate size. In this paper, we propose a new (dual) reformulation of the convex optimization problem in MMV and develop an efficient algorithm based on the prox-method. Interestingly, our theoretical analysis reveals the close connection between the proposed reformulation and multiple kernel learning. Our simulation studies demonstrate the scalability of the proposed algorithm.",
author = "Liang Sun and Jun Liu and Jianhui Chen and Jieping Ye",
year = "2009",
language = "English (US)",
isbn = "9781615679119",
pages = "1812--1820",
booktitle = "Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference",

}
