Multi-stage Dantzig selector

Ji Liu, Peter Wonka, Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

We consider the following sparse signal recovery (or feature selection) problem: given a design matrix X ∈ ℝ^(n×m) (m ≫ n) and a noisy observation vector y ∈ ℝ^n satisfying y = Xβ* + ε, where ε is a noise vector following the Gaussian distribution N(0, σ²I), how can we recover the signal (or parameter vector) β* when it is sparse? The Dantzig selector has been proposed for sparse signal recovery with strong theoretical guarantees. In this paper, we propose a multi-stage Dantzig selector method, which iteratively refines the estimate of the target signal β*. We show that if X obeys a certain condition, then with large probability the difference between the solution β̂ estimated by the proposed method and the true solution β*, measured in the ℓ_p norm (p ≥ 1), is bounded as ‖β̂ − β*‖_p ≤ (C(s − N)^(1/p) √(log m) + Δ)σ, where C is a constant, s is the number of nonzero entries in β*, Δ is independent of m and is much smaller than the first term, and N is the number of entries of β* larger than a certain value on the order of O(σ√(log m)). The proposed method improves the estimation bound of the standard Dantzig selector approximately from Cs^(1/p)√(log m)σ to C(s − N)^(1/p)√(log m)σ, where N depends on the number of large entries in β*. When N = s, the proposed algorithm achieves the oracle solution with high probability. In addition, with large probability, the proposed method selects the same number of correct features under a milder condition than the Dantzig selector.
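The abstract's iterative scheme can be sketched in code. Below is a minimal, hedged illustration: the Dantzig selector is posed as a linear program (the standard reduction, splitting β into nonnegative parts), and a multi-stage loop drops the ℓ1 weight on entries whose magnitude exceeds a threshold of order σ√(log m). The zero-weight reweighting and the `thresh` parameter are assumptions of this sketch, not the paper's exact formulation.

```python
# Sketch of the Dantzig selector and a multi-stage variant; the reweighting
# scheme is an illustrative reading of the paper's idea, not its exact method.
import numpy as np
from scipy.optimize import linprog


def dantzig_selector(X, y, lam, weights=None):
    """Solve  min sum_j w_j |beta_j|  s.t.  ||X^T (y - X beta)||_inf <= lam.

    Writing beta = u - v with u, v >= 0 turns this into a linear program
    in the stacked variable z = (u, v).
    """
    n, m = X.shape
    w = np.ones(m) if weights is None else np.asarray(weights, float)
    G = X.T @ X
    b = X.T @ y
    c = np.concatenate([w, w])               # objective: sum_j w_j (u_j + v_j)
    # -lam <= b - G(u - v) <= lam, rewritten as two "<=" blocks:
    A_ub = np.vstack([np.hstack([-G,  G]),   # -G u + G v <= lam - b
                      np.hstack([ G, -G])])  #  G u - G v <= lam + b
    b_ub = np.concatenate([lam - b, lam + b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    u, v = res.x[:m], res.x[m:]
    return u - v


def multistage_dantzig(X, y, lam, thresh, stages=3):
    """Multi-stage refinement (sketch): entries whose magnitude exceeds
    `thresh` (order sigma*sqrt(log m) in the paper) are treated as detected
    support, and their l1 weight is set to zero in the next stage, which
    removes their shrinkage bias."""
    beta = dantzig_selector(X, y, lam)
    for _ in range(stages - 1):
        weights = np.where(np.abs(beta) > thresh, 0.0, 1.0)
        beta = dantzig_selector(X, y, lam, weights)
    return beta
```

With N large entries detected, the later stages penalize only the remaining s − N unknown entries, which is the mechanism behind the improved (s − N)^(1/p) factor in the bound.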

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
State: Published - 2010
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: Dec 6, 2010 - Dec 9, 2010



ASJC Scopus subject areas

  • Information Systems

Cite this

Liu, J., Wonka, P., & Ye, J. (2010). Multi-stage Dantzig selector. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

@inproceedings{d0d3500fafb8467eb28aaf7995945c2d,
title = "Multi-stage Dantzig selector",
author = "Ji Liu and Peter Wonka and Jieping Ye",
year = "2010",
language = "English (US)",
isbn = "9781617823800",
booktitle = "Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010",

}
