### Abstract

Data of piecewise smooth images are sometimes acquired as Fourier samples. Standard reconstruction techniques yield the Gibbs phenomenon, causing spurious oscillations at jump discontinuities and an overall reduction of the convergence rate to first order away from the jumps. Filtering is an inexpensive way to improve the rate of convergence away from the discontinuities, but it has the adverse side effect of blurring the approximation at the jump locations. Conversely, high-resolution post-processing algorithms are often computationally cost prohibitive and also require explicit knowledge of all jump locations. Recent convex optimization algorithms using $$l^1$$ regularization exploit the expected sparsity of some features of the image. Wavelets or finite differences are often used to generate the corresponding sparsifying transform, and they work well for piecewise constant images; they are less useful when there is more variation in the image, however. In this paper we develop a convex optimization algorithm that exploits the sparsity in the edges of the underlying image. We use the polynomial annihilation edge detection method to generate the corresponding sparsifying transform. Our method successfully reduces the Gibbs phenomenon with only minimal blurring at the discontinuities while retaining a high rate of convergence in smooth regions.
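The approach the abstract outlines, penalizing the $$l^1$$ norm of an edge-sparsifying transform of the image while enforcing consistency with the measured Fourier modes, can be illustrated in one dimension. The following is a minimal sketch, not the authors' algorithm: it substitutes a first-order finite-difference transform for the polynomial annihilation operator, uses a Huber-smoothed $$l^1$$ term so that plain gradient descent applies in place of their convex solver, and the parameters `lam`, `eps`, and `step` are illustrative choices.

```python
import numpy as np

# Grid and a piecewise constant test signal (a 1-D slice of an image).
n = 128
x = np.arange(n) / n
f = np.where(x < 0.5, 1.0, -0.5)

# Partial Fourier data: only the 2M+1 lowest modes are measured.
M = 16
k = np.arange(-M, M + 1)
A = np.exp(-2j * np.pi * np.outer(k, x)) / n      # partial DFT, one row per mode
fhat = A @ f

# Baseline: truncated Fourier-series reconstruction -> Gibbs oscillations.
# (n * A^H is the adjoint of the partial DFT on this uniform grid.)
f_gibbs = np.real(n * A.conj().T @ fhat)

# l1-regularized reconstruction,
#   min_u  n * ||A u - fhat||_2^2 + lam * ||D u||_1,
# where D is a first-difference sparsifying transform and the l1 term is
# Huber-smoothed so that plain gradient descent converges.
lam, eps, step = 0.02, 1e-2, 0.3

def gradient(u):
    r = A @ u - fhat
    data = 2.0 * n * np.real(A.conj().T @ r)      # gradient of the fidelity term
    d = np.diff(u)
    w = d / np.sqrt(d**2 + eps)                   # derivative of smoothed |d|
    reg = np.zeros_like(u)
    reg[:-1] -= w                                 # assemble D^T w
    reg[1:] += w
    return data + lam * reg

u = f_gibbs.copy()                                # warm start from the Gibbs recon
for _ in range(3000):
    u -= step * gradient(u)
```

Comparing `np.linalg.norm(u - f)` against `np.linalg.norm(f_gibbs - f)` shows the effect: the regularized solution flattens the spurious oscillations that the truncated series produces near the jump.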

| Language | English (US) |
| --- | --- |
| Pages | 533-552 |
| Number of pages | 20 |
| Journal | Journal of Scientific Computing |
| Volume | 65 |
| Issue number | 2 |
| DOIs | https://doi.org/10.1007/s10915-014-9973-3 |
| State | Published - Dec 18 2014 |

### Keywords

- Convex optimization
- Edge detection
- Fourier data
- l1 regularization
- Polynomial annihilation

### ASJC Scopus subject areas

- Software
- Computational Theory and Mathematics
- Theoretical Computer Science
- Engineering(all)

### Cite this

Wasserman, G., Archibald, R., & Gelb, A. (2014). Image Reconstruction from Fourier Data Using Sparsity of Edges. *Journal of Scientific Computing*, *65*(2), 533-552. https://doi.org/10.1007/s10915-014-9973-3

Research output: Contribution to journal › Article

TY - JOUR

T1 - Image Reconstruction from Fourier Data Using Sparsity of Edges

AU - Wasserman, Gabriel

AU - Archibald, Rick

AU - Gelb, Anne

PY - 2014/12/18

Y1 - 2014/12/18

N2 - Data of piecewise smooth images are sometimes acquired as Fourier samples. Standard reconstruction techniques yield the Gibbs phenomenon, causing spurious oscillations at jump discontinuities and an overall reduced rate of convergence to first order away from the jumps. Filtering is an inexpensive way to improve the rate of convergence away from the discontinuities, but it has the adverse side effect of blurring the approximation at the jump locations. On the flip side, high resolution post processing algorithms are often computationally cost prohibitive and also require explicit knowledge of all jump locations. Recent convex optimization algorithms using l1 regularization exploit the expected sparsity of some features of the image. Wavelets or finite differences are often used to generate the corresponding sparsifying transform and work well for piecewise constant images. They are less useful when there is more variation in the image, however. In this paper we develop a convex optimization algorithm that exploits the sparsity in the edges of the underlying image. We use the polynomial annihilation edge detection method to generate the corresponding sparsifying transform. Our method successfully reduces the Gibbs phenomenon with only minimal blurring at the discontinuities while retaining a high rate of convergence in smooth regions.

AB - Data of piecewise smooth images are sometimes acquired as Fourier samples. Standard reconstruction techniques yield the Gibbs phenomenon, causing spurious oscillations at jump discontinuities and an overall reduced rate of convergence to first order away from the jumps. Filtering is an inexpensive way to improve the rate of convergence away from the discontinuities, but it has the adverse side effect of blurring the approximation at the jump locations. On the flip side, high resolution post processing algorithms are often computationally cost prohibitive and also require explicit knowledge of all jump locations. Recent convex optimization algorithms using l1 regularization exploit the expected sparsity of some features of the image. Wavelets or finite differences are often used to generate the corresponding sparsifying transform and work well for piecewise constant images. They are less useful when there is more variation in the image, however. In this paper we develop a convex optimization algorithm that exploits the sparsity in the edges of the underlying image. We use the polynomial annihilation edge detection method to generate the corresponding sparsifying transform. Our method successfully reduces the Gibbs phenomenon with only minimal blurring at the discontinuities while retaining a high rate of convergence in smooth regions.

KW - Convex optimization

KW - Edge detection

KW - Fourier data

KW - l1 regularization

KW - Polynomial annihilation

UR - http://www.scopus.com/inward/record.url?scp=84944171803&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84944171803&partnerID=8YFLogxK

U2 - 10.1007/s10915-014-9973-3

DO - 10.1007/s10915-014-9973-3

M3 - Article

VL - 65

SP - 533

EP - 552

JO - Journal of Scientific Computing

T2 - Journal of Scientific Computing

JF - Journal of Scientific Computing

SN - 0885-7474

IS - 2

ER -