Abstract

Learning Markov Blankets is important for classification and regression, causal discovery, and Bayesian network learning. We present an argument that ensemble masking measures can provide an approximate Markov Blanket. Consequently, an ensemble feature selection method can be used to learn Markov Blankets for either discrete or continuous networks (without linear, Gaussian assumptions). We use masking measures for redundancy and statistical inference for feature selection criteria. We compare our performance in the causal structure learning problem to a collection of common feature selection methods. We also compare to Bayesian local structure learning. These results can also be easily extended to other causal structure models such as undirected graphical models.
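The approach the abstract describes — using ensemble feature importance, with a noise-based relevance threshold, to approximate the Markov blanket of a target variable — can be illustrated with a loose sketch. The synthetic network, the permuted-probe threshold, and all variable names below are illustrative assumptions, not the chapter's exact procedure:

```python
# Hedged sketch: approximate the Markov blanket of y (its parents and
# children in a small continuous network) via random-forest importances,
# keeping features whose importance beats a pure-noise "probe" feature.
# This thresholding rule is an illustrative stand-in, not the authors'
# exact statistical inference criterion.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
x1 = rng.normal(size=n)            # parent of y
x2 = rng.normal(size=n)            # parent of y
y = x1 + x2 + 0.1 * rng.normal(size=n)
x3 = y + 0.1 * rng.normal(size=n)  # child of y (in the blanket)
x4 = rng.normal(size=n)            # independent of y (not in the blanket)
probe = rng.permutation(x4)        # artificial contrast: pure noise

X = np.column_stack([x1, x2, x3, x4, probe])
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
imp = forest.feature_importances_

# Select the features more important than the noise probe.
selected = [i for i in range(4) if imp[i] > imp[4]]
print("approximate Markov blanket (indices):", selected)
```

Because the forest makes no linearity or Gaussianity assumption, the same sketch applies to nonlinear continuous networks; for a discrete target one would swap in a classifier ensemble.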

Original language: English (US)
Title of host publication: Studies in Computational Intelligence
Pages: 117-131
Number of pages: 15
Volume: 373
DOI: 10.1007/978-3-642-22910-7_7
State: Published - 2011

Publication series

Name: Studies in Computational Intelligence
Volume: 373
ISSN (Print): 1860-949X


ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Deng, H., Davila, S., Runger, G., & Tuv, E. (2011). Learning markov blankets for continuous or discrete networks via feature selection. In Studies in Computational Intelligence (Vol. 373, pp. 117-131). (Studies in Computational Intelligence; Vol. 373). https://doi.org/10.1007/978-3-642-22910-7_7

Research output: Chapter in Book/Report/Conference proceeding › Chapter

@inbook{819af28b20614ad0a7e335b50232d71f,
title = "Learning markov blankets for continuous or discrete networks via feature selection",
abstract = "Learning Markov Blankets is important for classification and regression, causal discovery, and Bayesian network learning. We present an argument that ensemble masking measures can provide an approximate Markov Blanket. Consequently, an ensemble feature selection method can be used to learn Markov Blankets for either discrete or continuous networks (without linear, Gaussian assumptions). We use masking measures for redundancy and statistical inference for feature selection criteria. We compare our performance in the causal structure learning problem to a collection of common feature selection methods. We also compare to Bayesian local structure learning. These results can also be easily extended to other causal structure models such as undirected graphical models.",
author = "Houtao Deng and Saylisse Davila and George Runger and Eugene Tuv",
year = "2011",
doi = "10.1007/978-3-642-22910-7_7",
language = "English (US)",
isbn = "9783642229091",
volume = "373",
series = "Studies in Computational Intelligence",
pages = "117--131",
booktitle = "Studies in Computational Intelligence",

}

TY - CHAP

T1 - Learning markov blankets for continuous or discrete networks via feature selection

AU - Deng, Houtao

AU - Davila, Saylisse

AU - Runger, George

AU - Tuv, Eugene

PY - 2011

Y1 - 2011

N2 - Learning Markov Blankets is important for classification and regression, causal discovery, and Bayesian network learning. We present an argument that ensemble masking measures can provide an approximate Markov Blanket. Consequently, an ensemble feature selection method can be used to learn Markov Blankets for either discrete or continuous networks (without linear, Gaussian assumptions). We use masking measures for redundancy and statistical inference for feature selection criteria. We compare our performance in the causal structure learning problem to a collection of common feature selection methods. We also compare to Bayesian local structure learning. These results can also be easily extended to other causal structure models such as undirected graphical models.

AB - Learning Markov Blankets is important for classification and regression, causal discovery, and Bayesian network learning. We present an argument that ensemble masking measures can provide an approximate Markov Blanket. Consequently, an ensemble feature selection method can be used to learn Markov Blankets for either discrete or continuous networks (without linear, Gaussian assumptions). We use masking measures for redundancy and statistical inference for feature selection criteria. We compare our performance in the causal structure learning problem to a collection of common feature selection methods. We also compare to Bayesian local structure learning. These results can also be easily extended to other causal structure models such as undirected graphical models.

UR - http://www.scopus.com/inward/record.url?scp=80054738913&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=80054738913&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-22910-7_7

DO - 10.1007/978-3-642-22910-7_7

M3 - Chapter

AN - SCOPUS:80054738913

SN - 9783642229091

VL - 373

T3 - Studies in Computational Intelligence

SP - 117

EP - 131

BT - Studies in Computational Intelligence

ER -