### Abstract

We consider the following signal recovery problem: given a measurement matrix Φ ∈ ℝ^{n×p} and a noisy observation vector c ∈ ℝ^{n} constructed from c = Φθ* + ε, where ε ∈ ℝ^{n} is a noise vector whose entries are i.i.d. centered sub-Gaussian, how can we recover the signal θ* if Dθ* is sparse under a linear transformation D ∈ ℝ^{m×p}? One natural method is to solve a convex optimization problem (equation omitted). This paper provides an upper bound on the estimation error and establishes the consistency of this method under the assumption that the design matrix Φ is a Gaussian random matrix. Specifically, we show that 1) in the noiseless case, if the condition number of D is bounded and the number of measurements satisfies n ≥ Ω(s log(p)), where s is the sparsity level, then the true solution is recovered with high probability; and 2) in the noisy case, if the condition number of D is bounded and the number of measurements grows faster than s log(p), that is, s log(p) = o(n), then the estimation error converges to zero with probability 1 as p and s go to infinity. Our results are consistent with those for the special case D = I_{p×p} (equivalently, LASSO) and improve the existing analysis. The condition number of D plays a critical role in our analysis. We consider the condition number in two cases: the fused LASSO and the random graph. In the fused LASSO case the condition number is bounded by a constant, while in the random graph case it is bounded with high probability if m/p (i.e., #edges/#vertices) is larger than a certain constant. Numerical simulations are consistent with our theoretical results.
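The measurement model in the abstract can be sketched numerically. The snippet below is a minimal illustration, not code from the paper: all dimensions and values are assumed for the example. It builds a fused-LASSO-style first-difference operator D (one standard choice of linear transformation), a piecewise-constant signal θ* so that Dθ* is sparse, and a Gaussian design matrix Φ with Gaussian noise ε, matching the setup c = Φθ* + ε.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 80, 100  # measurements and signal dimension (illustrative choices)

# Fused-LASSO transform: first-difference operator D in R^{(p-1) x p},
# so (D @ theta)[i] = theta[i+1] - theta[i].
D = np.eye(p - 1, p, k=1) - np.eye(p - 1, p)

# Piecewise-constant ground truth: four constant blocks of length p/4,
# so D @ theta_star is nonzero only at the three block boundaries.
theta_star = np.repeat([0.0, 3.0, -2.0, 1.0], p // 4)

# Gaussian design matrix and (sub-)Gaussian noise, as in the abstract.
Phi = rng.standard_normal((n, p)) / np.sqrt(n)
eps = 0.01 * rng.standard_normal(n)
c = Phi @ theta_star + eps

# Sparsity level s of D @ theta_star: one nonzero per jump in theta_star.
s = np.count_nonzero(D @ theta_star)
print(s)  # 3
```

With s = 3 and p = 100, the theorem's scaling n ≥ Ω(s log(p)) is the regime this example is meant to suggest; actually recovering θ* would additionally require solving the convex program with an ℓ1 penalty on Dθ, which is omitted here.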

| Original language | English (US) |
| --- | --- |
| Title of host publication | 30th International Conference on Machine Learning, ICML 2013 |
| Publisher | International Machine Learning Society (IMLS) |
| Pages | 1128-1136 |
| Number of pages | 9 |
| Edition | PART 2 |
| State | Published - 2013 |
| Event | 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States; Jun 16 2013 → Jun 21 2013 |

### Other

| Other | 30th International Conference on Machine Learning, ICML 2013 |
| --- | --- |
| Country | United States |
| City | Atlanta, GA |
| Period | 6/16/13 → 6/21/13 |

### ASJC Scopus subject areas

- Human-Computer Interaction
- Sociology and Political Science

## Fingerprint

Research topics of 'Guaranteed sparse recovery under linear transformation'.

## Cite this

*30th International Conference on Machine Learning, ICML 2013* (PART 2 ed., pp. 1128-1136). International Machine Learning Society (IMLS).