Abstract

Mutual Information (MI) is often used for feature selection when developing classifier models. Estimating the MI for a subset of features is often intractable. We demonstrate that, under assumptions of conditional independence, the MI between a subset of features can be expressed in terms of the Conditional Mutual Information (CMI) between pairs of features. Selecting the features with the highest CMI, however, turns out to be a hard combinatorial problem. In this work, we apply two global methods, the Truncated Power Method (TPower) and Low Rank Bilinear Approximation (LowRank), to solve the feature selection problem. These algorithms provide very good approximations to the NP-hard CMI-based feature selection problem. We experimentally demonstrate the effectiveness of these procedures across multiple datasets and compare them with existing MI-based global and iterative feature selection procedures.
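
To make the TPower idea concrete, below is a minimal sketch, not the paper's implementation, of a generic truncated power iteration selecting k features from a symmetric matrix Q of pairwise scores. It assumes Q has already been estimated (e.g. from pairwise CMI values, which the paper derives but which are not computed here); the function name and parameters are illustrative.

    import numpy as np

    def truncated_power_method(Q, k, iters=200, seed=0):
        # Approximately solve max_x x^T Q x s.t. ||x||_2 = 1, ||x||_0 <= k.
        # Q: symmetric (d x d) matrix of pairwise feature scores, assumed
        # precomputed (e.g. pairwise CMI estimates); k: features to select.
        rng = np.random.default_rng(seed)
        d = Q.shape[0]
        x = rng.standard_normal(d)
        x /= np.linalg.norm(x)
        for _ in range(iters):
            y = Q @ x                          # power-iteration step
            keep = np.argsort(np.abs(y))[-k:]  # k largest-magnitude entries
            x_new = np.zeros(d)
            x_new[keep] = y[keep]              # truncate to a k-sparse vector
            norm = np.linalg.norm(x_new)
            if norm == 0.0:                    # degenerate step: stop early
                break
            x_new /= norm
            if np.allclose(x_new, x):          # converged
                break
            x = x_new
        return np.flatnonzero(x)               # indices of selected features

    # Toy usage with a random symmetric PSD stand-in for a pairwise-score matrix.
    A = np.random.default_rng(1).standard_normal((20, 20))
    Q = A @ A.T
    print(truncated_power_method(Q, k=5))

The support of the final k-sparse iterate approximately maximizes the quadratic objective x^T Q x, which is the combinatorial subset-selection problem the abstract calls NP-hard; LowRank, as its name suggests, attacks a related objective through a low-rank bilinear approximation.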

Original language: English (US)
Title of host publication: Proceedings - 15th IEEE International Conference on Data Mining, ICDM 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1009-1014
Number of pages: 6
Volume: 2016-January
ISBN (Electronic): 9781467395038
DOI: 10.1109/ICDM.2015.140
State: Published - Jan 5, 2016
Event: 15th IEEE International Conference on Data Mining, ICDM 2015, Atlantic City, United States
Duration: Nov 14, 2015 - Nov 17, 2015

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Demakethepalli Venkateswara, H., Lade, P., Lin, B., Ye, J., & Panchanathan, S. (2016). Efficient approximate solutions to mutual information based global feature selection. In Proceedings - 15th IEEE International Conference on Data Mining, ICDM 2015 (Vol. 2016-January, pp. 1009-1014). [7373427] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICDM.2015.140

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
