Learning Markov blankets for continuous or discrete networks via feature selection

Houtao Deng, Saylisse Davila, George Runger, Eugene Tuv

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Learning Markov Blankets is important for classification and regression, causal discovery, and Bayesian network learning. We present an argument that ensemble masking measures can provide an approximate Markov Blanket. Consequently, an ensemble feature selection method can be used to learn Markov Blankets for either discrete or continuous networks (without linear, Gaussian assumptions). We use masking measures for redundancy and statistical inference for feature selection criteria. We compare our performance in the causal structure learning problem to a collection of common feature selection methods. We also compare to Bayesian local structure learning. These results can also be easily extended to other causal structure models such as undirected graphical models.
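As a rough illustration of the general idea described in the abstract (not the authors' exact procedure), the sketch below uses a tree-ensemble importance score, contrasted against shuffled "shadow" copies of the features, to keep only the variables that plausibly belong to the Markov blanket of a target. The toy data, scikit-learn estimator, and the max-of-noise threshold are all assumptions made for illustration.

```python
# Minimal sketch: approximate a target's Markov blanket by keeping
# features whose ensemble importance exceeds that of every shuffled
# (noise) copy. Data and threshold are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy data: X1 and X2 drive the target; X3 is irrelevant noise.
n = 2000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = (x1 + x2 + 0.3 * rng.normal(size=n) > 0).astype(int)
X = np.column_stack([x1, x2, x3])

# Shuffle each column independently to create "shadow" features that
# preserve marginal distributions but destroy any relation to y.
X_shadow = rng.permuted(X, axis=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(np.hstack([X, X_shadow]), y)

importances = forest.feature_importances_
real, noise = importances[:3], importances[3:]
blanket = [f"X{i + 1}" for i in range(3) if real[i] > noise.max()]
print("Approximate Markov blanket of y:", blanket)  # expect X1, X2
```

In this sketch the shuffled copies play the role of a relevance baseline; more careful statistical inference over repeated runs, as the abstract alludes to, would replace the single max-of-noise cutoff used here.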

Original language: English (US)
Title of host publication: Ensembles in Machine Learning Applications
Editors: Oleg Okun, Giorgio Valentini, Matteo Re
Pages: 117-131
Number of pages: 15
DOIs
State: Published - 2011

Publication series

Name: Studies in Computational Intelligence
Volume: 373
ISSN (Print): 1860-949X

ASJC Scopus subject areas

  • Artificial Intelligence
