Image Reconstruction from Fourier Data Using Sparsity of Edges

Gabriel Wasserman, Rick Archibald, Anne Gelb

Research output: Contribution to journal › Article

7 Scopus citations

Abstract

Data of piecewise smooth images are sometimes acquired as Fourier samples. Standard reconstruction techniques yield the Gibbs phenomenon, causing spurious oscillations at jump discontinuities and an overall reduced rate of convergence, to first order, away from the jumps. Filtering is an inexpensive way to improve the rate of convergence away from the discontinuities, but it has the adverse side effect of blurring the approximation at the jump locations. Conversely, high resolution post-processing algorithms are often computationally cost prohibitive and also require explicit knowledge of all jump locations. Recent convex optimization algorithms using $l^1$ regularization exploit the expected sparsity of some features of the image. Wavelets or finite differences are often used to generate the corresponding sparsifying transform and work well for piecewise constant images. They are less useful when there is more variation in the image, however. In this paper we develop a convex optimization algorithm that exploits the sparsity in the edges of the underlying image. We use the polynomial annihilation edge detection method to generate the corresponding sparsifying transform. Our method successfully reduces the Gibbs phenomenon with only minimal blurring at the discontinuities while retaining a high rate of convergence in smooth regions.
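The key idea behind the sparsifying transform can be illustrated in one dimension: an order-$m$ difference operator annihilates polynomials of degree less than $m$, so applying it to a piecewise polynomial signal produces a vector that is nonzero only near the jump discontinuities. This is a minimal sketch of that behavior using plain finite differences as a stand-in for the polynomial annihilation transform; the signal, grid size, and jump location are illustrative choices, not taken from the paper.

```python
import numpy as np

def annihilating_transform(x, order):
    """Order-m finite differences annihilate polynomials of degree < m,
    so the output is (near-)zero in smooth regions and large at jumps."""
    return np.diff(x, n=order)

# Piecewise-quadratic test signal with a single jump at index 50.
n = 100
t = np.linspace(0.0, 1.0, n)
x = t**2
x[50:] += 1.0  # unit jump discontinuity

# Third-order differences annihilate the quadratic pieces exactly,
# leaving nonzero entries only where the stencil straddles the jump.
d = annihilating_transform(x, order=3)
support = np.flatnonzero(np.abs(d) > 1e-8)
```

Here `support` contains only the three indices whose difference stencil crosses the jump, which is exactly the sparsity that an $l^1$ penalty on the transformed image exploits.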

Original language: English (US)
Pages (from-to): 533-552
Number of pages: 20
Journal: Journal of Scientific Computing
Volume: 65
Issue number: 2
DOIs
State: Published - Dec 18 2014

Keywords

  • Convex optimization
  • Edge detection
  • Fourier data
  • l^1 regularization
  • Polynomial annihilation

ASJC Scopus subject areas

  • Software
  • Computational Theory and Mathematics
  • Theoretical Computer Science
  • Engineering(all)