Abstract

Sparse learning has proven to be a powerful technique for supervised feature selection, as it allows feature selection to be embedded directly in the classification (or regression) problem. In recent years, increasing attention has been paid to applying sparse learning to unsupervised feature selection. Due to the lack of label information, the vast majority of these algorithms first generate cluster labels via clustering algorithms and then formulate unsupervised feature selection as sparse-learning-based supervised feature selection using the generated cluster labels. In this paper, we propose a novel unsupervised feature selection algorithm, EUFS, which directly embeds feature selection into a clustering algorithm via sparse learning, without this transformation. The Alternating Direction Method of Multipliers (ADMM) is used to solve the optimization problem of EUFS. Experimental results on various benchmark datasets demonstrate the effectiveness of the proposed framework EUFS.
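To make the sparse-learning idea behind the abstract concrete, the sketch below ranks features by the l2,1-style row norms of a learned weight matrix on toy data. This is a minimal, generic illustration, not the paper's EUFS objective: the actual method couples clustering with the sparsity penalty and solves it via ADMM, whereas here the weight matrix, the ridge surrogate, and the pseudo-cluster targets are all simplifying assumptions for demonstration only.

```python
import numpy as np

# Toy data: 100 samples, 6 features; only feature 0 carries cluster structure.
rng = np.random.default_rng(0)
n, d, k = 100, 6, 2
X = rng.normal(size=(n, d))
X[:, 0] += 3.0 * (rng.random(n) > 0.5)  # bimodal signal on feature 0

# Surrogate "cluster indicator" targets built from feature 0 (illustrative
# stand-in for the cluster structure EUFS would learn jointly).
Y = np.column_stack([X[:, 0] > X[:, 0].mean(),
                     X[:, 0] <= X[:, 0].mean()]).astype(float)

# Ridge-style weight matrix W (d x k) mapping features to the targets;
# in EUFS this transformation is learned with an l2,1 sparsity penalty.
W = np.linalg.solve(X.T @ X + 0.1 * np.eye(d), X.T @ Y)

# l2,1-style score: l2 norm of each row of W; a larger row norm means the
# corresponding feature contributes more and should be selected.
scores = np.linalg.norm(W, axis=1)
ranking = np.argsort(scores)[::-1]
print(int(ranking[0]))  # the informative feature (index 0) ranks first
```

Row-wise l2 norms are the standard way an l2,1-regularized weight matrix induces feature selection: the penalty drives entire rows toward zero, so surviving rows with large norms identify the selected features.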

Original language: English (US)
Title of host publication: Proceedings of the National Conference on Artificial Intelligence
Publisher: AI Access Foundation
Pages: 470-476
Number of pages: 7
Volume: 1
ISBN (Print): 9781577356998
Publication status: Published - Jun 1 2015
Event: 29th AAAI Conference on Artificial Intelligence, AAAI 2015 and the 27th Innovative Applications of Artificial Intelligence Conference, IAAI 2015 - Austin, United States
Duration: Jan 25 2015 - Jan 30 2015


ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Wang, S., Tang, J., & Liu, H. (2015). Embedded unsupervised feature selection. In Proceedings of the National Conference on Artificial Intelligence (Vol. 1, pp. 470-476). AI Access Foundation.