Learning hidden Markov sparse models

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This paper considers the problem of separating streams of unknown non-stationary signals from under-determined mixtures of sources. The source signals are modeled by a hidden Markov model (HMM) in which each state of the Markov chain is determined by the on (i.e., active) or off (i.e., inactive) status of the sources, with unknown probability density functions (pdfs) in the on-state. Under the assumption that the number of active sources is small compared to the total number of sources (i.e., the sources are sparse), the goal is to recursively estimate the HMM state and the over-complete mixing matrix, and subsequently the source signals, for signal recovery. The proposed approach combines HMM-based filtering with manifold-based dictionary learning to estimate both the state and the mixing matrix. Because only a sparse set of sources is simultaneously active, this setting generalizes the typical dictionary-learning scenario, in which a small number of temporally independent signals are active. To extract the activity profile of the sources from the observations, a change-of-measure technique is used to decouple the observations from the sources by introducing a new probability measure over the set of observations. Under this new measure, the un-normalized conditional densities of the state and the transition matrix of the Markov chain can be computed recursively. Due to the scaling ambiguity of the mixing matrix, we introduce an equivalence relation that partitions the set of mixing matrices into equivalence classes. Rather than estimating the mixing matrix under a unit-norm constraint, the proposed algorithm searches directly for the equivalence class that contains the true mixing matrix. In our simulations, the proposed recursive algorithm with manifold-based dictionary learning estimates the mixing matrix more efficiently than unit-norm-constrained algorithms while maintaining high accuracy.
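
The setting lends itself to a short illustration. The sketch below is a minimal, hypothetical example of the signal model (independent per-source on/off Markov chains, standard-normal amplitudes in the on-state, Gaussian noise, and a known mixing matrix) together with a standard HMM forward filter over the joint activity pattern; all names, sizes, and distributions are illustrative assumptions, not the authors' implementation, which uses a change-of-measure recursion and learns the mixing matrix on a manifold of equivalence classes.

```python
# Hypothetical sketch: sparse sources with Markov on/off activity, observed
# through an over-complete (under-determined) mixing matrix, and a standard
# HMM forward filter over the joint on/off pattern. Sizes, priors, and the
# assumption of a known mixing matrix are illustrative only.
import itertools
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

n_src, n_obs, T = 6, 3, 300          # more sources than observations
p_on, p_off, sigma = 0.05, 0.20, 0.05

A = rng.standard_normal((n_obs, n_src))
A /= np.linalg.norm(A, axis=0)       # one representative of each scale-equivalence class

# Generate data: each source's on/off activity is an independent two-state Markov chain.
active = np.zeros((T, n_src), dtype=bool)
for t in range(1, T):
    active[t] = np.where(active[t - 1],
                         rng.random(n_src) > p_off,   # stay on
                         rng.random(n_src) < p_on)    # turn on
S = np.where(active, rng.standard_normal((T, n_src)), 0.0)   # sparse source amplitudes
X = S @ A.T + sigma * rng.standard_normal((T, n_obs))        # noisy under-determined mixtures

# Joint HMM state space: every on/off pattern of the sources (2**n_src states).
states = np.array(list(itertools.product([0, 1], repeat=n_src)), dtype=bool)
K = len(states)

def trans_prob(s_from, s_to):
    # Transition probability for independent per-source on/off chains.
    p = np.where(s_from, np.where(s_to, 1 - p_off, p_off),
                         np.where(s_to, p_on, 1 - p_on))
    return p.prod()

P = np.array([[trans_prob(si, sj) for sj in states] for si in states])

def emission_logpdf(x, s):
    # With standard-normal on-state amplitudes, x given pattern s is zero-mean Gaussian.
    cov = A[:, s] @ A[:, s].T + sigma**2 * np.eye(n_obs)
    return multivariate_normal.logpdf(x, mean=np.zeros(n_obs), cov=cov)

# Forward recursion: predict with the transition matrix, update with the
# emission likelihood, and track the filtered distribution over patterns.
alpha = np.full(K, 1.0 / K)                      # uniform prior over activity patterns
for t in range(T):
    if t > 0:
        alpha = P.T @ alpha                      # predict step
    logb = np.array([emission_logpdf(X[t], s) for s in states])
    alpha = alpha * np.exp(logb - logb.max())    # update step (rescaled for stability)
    alpha /= alpha.sum()
    # np.argmax(alpha) indexes the most likely on/off pattern at time t.
```

Joint estimation of the mixing matrix, which the paper carries out by searching over equivalence classes on a manifold rather than by renormalizing columns, is not attempted in this sketch.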

Original language: English (US)
Title of host publication: 2013 Information Theory and Applications Workshop, ITA 2013 - Conference Proceedings
Pages: 212-221
Number of pages: 10
DOI: 10.1109/ITA.2013.6502950
State: Published - 2013
Externally published: Yes
Event: 2013 Information Theory and Applications Workshop, ITA 2013 - San Diego, CA, United States
Duration: Feb 10, 2013 - Feb 15, 2013

Other

Other: 2013 Information Theory and Applications Workshop, ITA 2013
Country: United States
City: San Diego, CA
Period: 2/10/13 - 2/15/13

Fingerprint

Hidden Markov models
Dictionary learning
Equivalence classes
Markov processes
Probability density function
Recovery

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems

Cite this

Li, L., & Scaglione, A. (2013). Learning hidden Markov sparse models. In 2013 Information Theory and Applications Workshop, ITA 2013 - Conference Proceedings (pp. 212-221). [6502950] https://doi.org/10.1109/ITA.2013.6502950
