Fast adaptive algorithms using eigenspace projections

N. Gopalan Nair, Andreas Spanias

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Although adaptive gradient algorithms are relatively robust, they generally have poor performance in the absence of "rich" excitation. It is well known that the convergence speed of the LMS algorithm deteriorates when the condition number of the autocorrelation matrix of the input is large. This problem has been addressed using RLS, Weighted RLS (WRLS), as well as normalized frequency-domain algorithms. In this paper, we present an alternative approach that employs gradient projections in selected eigenvector subspaces to improve the convergence properties of LMS algorithms. We also use an auxiliary algorithm that iteratively updates selected eigensubspaces. The proposed algorithm is efficient in terms of complexity and its convergence speed approaches that of the WRLS for a certain class of excitation signals.

Original language: English (US)
Pages (from-to): 1929-1935
Number of pages: 7
Journal: Signal Processing
Volume: 83
Issue number: 9
DOI: 10.1016/S0165-1684(03)00111-7
State: Published - Sep 2003

Fingerprint

  • Adaptive algorithms
  • Autocorrelation
  • Eigenvalues and eigenfunctions

Keywords

  • Adaptive algorithms
  • Eigenspace estimation
  • Gradient projections

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Fast adaptive algorithms using eigenspace projections. / Nair, N. Gopalan; Spanias, Andreas.

In: Signal Processing, Vol. 83, No. 9, 09.2003, p. 1929-1935.

Research output: Contribution to journal › Article

Nair, N. Gopalan ; Spanias, Andreas. / Fast adaptive algorithms using eigenspace projections. In: Signal Processing. 2003 ; Vol. 83, No. 9. pp. 1929-1935.
@article{a5e4f138a24a4ce49485320406f97d85,
title = "Fast adaptive algorithms using eigenspace projections",
abstract = "Although adaptive gradient algorithms are relatively robust, they generally have poor performance in the absence of ``rich'' excitation. It is well known that the convergence speed of the LMS algorithm deteriorates when the condition number of the autocorrelation matrix of the input is large. This problem has been addressed using RLS, Weighted RLS (WRLS), as well as normalized frequency-domain algorithms. In this paper, we present an alternative approach that employs gradient projections in selected eigenvector subspaces to improve the convergence properties of LMS algorithms. We also use an auxiliary algorithm that iteratively updates selected eigensubspaces. The proposed algorithm is efficient in terms of complexity and its convergence speed approaches that of the WRLS for a certain class of excitation signals.",
keywords = "Adaptive algorithms, Eigenspace estimation, Gradient projections",
author = "Nair, {N. Gopalan} and Andreas Spanias",
year = "2003",
month = sep,
doi = "10.1016/S0165-1684(03)00111-7",
language = "English (US)",
volume = "83",
pages = "1929--1935",
journal = "Signal Processing",
issn = "0165-1684",
publisher = "Elsevier",
number = "9",

}

TY - JOUR

T1 - Fast adaptive algorithms using eigenspace projections

AU - Nair, N. Gopalan

AU - Spanias, Andreas

PY - 2003/9

Y1 - 2003/9

N2 - Although adaptive gradient algorithms are relatively robust, they generally have poor performance in the absence of "rich" excitation. It is well known that the convergence speed of the LMS algorithm deteriorates when the condition number of the autocorrelation matrix of the input is large. This problem has been addressed using RLS, Weighted RLS (WRLS), as well as normalized frequency-domain algorithms. In this paper, we present an alternative approach that employs gradient projections in selected eigenvector subspaces to improve the convergence properties of LMS algorithms. We also use an auxiliary algorithm that iteratively updates selected eigensubspaces. The proposed algorithm is efficient in terms of complexity and its convergence speed approaches that of the WRLS for a certain class of excitation signals.

AB - Although adaptive gradient algorithms are relatively robust, they generally have poor performance in the absence of "rich" excitation. It is well known that the convergence speed of the LMS algorithm deteriorates when the condition number of the autocorrelation matrix of the input is large. This problem has been addressed using RLS, Weighted RLS (WRLS), as well as normalized frequency-domain algorithms. In this paper, we present an alternative approach that employs gradient projections in selected eigenvector subspaces to improve the convergence properties of LMS algorithms. We also use an auxiliary algorithm that iteratively updates selected eigensubspaces. The proposed algorithm is efficient in terms of complexity and its convergence speed approaches that of the WRLS for a certain class of excitation signals.

KW - Adaptive algorithms

KW - Eigenspace estimation

KW - Gradient projections

UR - http://www.scopus.com/inward/record.url?scp=0042629852&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0042629852&partnerID=8YFLogxK

U2 - 10.1016/S0165-1684(03)00111-7

DO - 10.1016/S0165-1684(03)00111-7

M3 - Article

AN - SCOPUS:0042629852

VL - 83

SP - 1929

EP - 1935

JO - Signal Processing

JF - Signal Processing

SN - 0165-1684

IS - 9

ER -