Projection-free kernel principal component analysis for denoising

Anh Tuan Bui, Joon Ku Im, Daniel W. Apley, George Runger

Research output: Contribution to journal › Article

Abstract

Kernel principal component analysis (KPCA) forms the basis for a class of methods commonly used for denoising a set of multivariate observations. Most KPCA algorithms involve two steps: projection and preimage approximation. We argue that this two-step procedure can be inefficient and result in poor denoising. We propose an alternative projection-free KPCA denoising approach that does not involve the usual projection and subsequent preimage approximation steps. In order to denoise an observation, our approach performs a single line search along the gradient descent direction of the squared projection error. The rationale is that this moves an observation towards the underlying manifold that represents the noiseless data in the most direct manner possible. We demonstrate that the approach is simple, computationally efficient, robust, and sometimes provides substantially better denoising than the standard KPCA algorithm.
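To make the single-step idea in the abstract concrete, below is a minimal sketch (not the authors' reference implementation): it fits an uncentered Gaussian-kernel KPCA, evaluates the squared projection error of a point from kernel evaluations alone, and denoises the point with one bounded line search along the negative gradient of that error. The finite-difference gradient, the kernel width, the number of components, and the line-search bound are illustrative assumptions, not values from the paper.

```python
# Sketch of projection-free KPCA denoising: one line search along the negative
# gradient of the squared projection error. Assumes an uncentered Gaussian-kernel
# KPCA; sigma, q, t_max are illustrative choices, not values from the paper.
import numpy as np
from scipy.optimize import minimize_scalar

def gaussian_kernel(A, B, sigma):
    """Pairwise Gaussian kernel k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_kpca(X, q, sigma):
    """Coefficients of the top-q kernel principal directions (uncentered KPCA)."""
    K = gaussian_kernel(X, X, sigma)                 # n x n kernel matrix
    eigvals, eigvecs = np.linalg.eigh(K)             # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:q]              # keep the top-q components
    # Scale eigenvectors so the feature-space principal directions have unit norm.
    return eigvecs[:, idx] / np.sqrt(eigvals[idx])   # n x q coefficient matrix

def projection_error(x, X, alpha, sigma):
    """Squared distance from phi(x) to its projection onto the KPCA subspace."""
    kx = gaussian_kernel(X, x[None, :], sigma).ravel()  # k(x_j, x), j = 1..n
    beta = alpha.T @ kx                                  # projection coefficients
    return 1.0 - beta @ beta                             # k(x, x) = 1 for Gaussian kernel

def denoise(x, X, alpha, sigma, t_max=5.0, eps=1e-5):
    """Single bounded line search along the (finite-difference) descent direction."""
    g = np.array([(projection_error(x + eps * e, X, alpha, sigma)
                   - projection_error(x - eps * e, X, alpha, sigma)) / (2.0 * eps)
                  for e in np.eye(x.size)])
    res = minimize_scalar(lambda t: projection_error(x - t * g, X, alpha, sigma),
                          bounds=(0.0, t_max), method="bounded")
    return x - res.x * g

# Toy usage: noisy observations near a circle in R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.1 * rng.standard_normal((200, 2))
alpha = fit_kpca(X, q=4, sigma=0.5)
x_denoised = denoise(X[0], X, alpha, sigma=0.5)
```

Note that this sketch moves each observation directly in input space, so no preimage approximation step is needed; the hedged choices above (uncentered KPCA, numerical gradient) keep the example short rather than reproducing the paper's exact formulation.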

Original language: English (US)
Pages (from-to): 163-176
Number of pages: 14
Journal: Neurocomputing
Volume: 357
DOIs: 10.1016/j.neucom.2019.04.042
State: Published - Sep 10 2019

Keywords

  • Feature space
  • Image processing
  • Pattern recognition
  • Preimage problem

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence

Cite this

Projection-free kernel principal component analysis for denoising. / Bui, Anh Tuan; Im, Joon Ku; Apley, Daniel W.; Runger, George.

In: Neurocomputing, Vol. 357, 10.09.2019, p. 163-176.

Research output: Contribution to journal › Article

@article{6322872a595b4800ad3b6ef2b226c899,
title = "Projection-free kernel principal component analysis for denoising",
abstract = "Kernel principal component analysis (KPCA) forms the basis for a class of methods commonly used for denoising a set of multivariate observations. Most KPCA algorithms involve two steps: projection and preimage approximation. We argue that this two-step procedure can be inefficient and result in poor denoising. We propose an alternative projection-free KPCA denoising approach that does not involve the usual projection and subsequent preimage approximation steps. In order to denoise an observation, our approach performs a single line search along the gradient descent direction of the squared projection error. The rationale is that this moves an observation towards the underlying manifold that represents the noiseless data in the most direct manner possible. We demonstrate that the approach is simple, computationally efficient, robust, and sometimes provides substantially better denoising than the standard KPCA algorithm.",
keywords = "Feature space, Image processing, Pattern recognition, Preimage problem",
author = "Bui, {Anh Tuan} and Im, {Joon Ku} and Apley, {Daniel W.} and George Runger",
year = "2019",
month = "9",
day = "10",
doi = "10.1016/j.neucom.2019.04.042",
language = "English (US)",
volume = "357",
pages = "163--176",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",

}
