Abstract

High-dimensional regression/classification is challenging due to the curse of dimensionality. The lasso [18] and its various extensions [10], which can simultaneously perform feature selection and regression/classification, have received increasing attention in this setting. However, in the presence of highly correlated features, the lasso tends to select only one feature from each correlated group, resulting in suboptimal performance [25]. Several methods have been proposed in the literature to address this issue. Shen and Ye [15] introduce an adaptive model selection procedure that corrects the estimation bias through a data-driven penalty based on generalized degrees of freedom.
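The behavior described above can be seen on synthetic data. The sketch below is not from the chapter; it assumes scikit-learn's Lasso and uses illustrative variable names and an arbitrary regularization strength to show how the lasso tends to keep only one of two nearly identical features.

```python
# Minimal sketch (illustrative only): lasso on highly correlated features.
# Assumes scikit-learn is available; data, names, and alpha are arbitrary.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 200

# Two highly correlated features that both drive the response,
# plus one irrelevant noise feature.
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)   # nearly identical to x1
x3 = rng.normal(size=n)               # irrelevant
X = np.column_stack([x1, x2, x3])
y = 3.0 * x1 + 3.0 * x2 + 0.1 * rng.normal(size=n)

lasso = Lasso(alpha=0.5)
lasso.fit(X, y)

# Typically, nearly all of the shared signal is assigned to one of the two
# correlated features while the other coefficient is shrunk to (or near) zero.
print(lasso.coef_)
```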

Original language: English (US)
Title of host publication: Graph Embedding for Pattern Analysis
Publisher: Springer New York
Pages: 27-43
Number of pages: 17
ISBN (Electronic): 9781461444572
ISBN (Print): 9781461444565
DOIs
State: Published - Jan 1 2013

ASJC Scopus subject areas

  • General Engineering
