Using Lasso for Predictor Selection and to Assuage Overfitting: A Method Long Overlooked in Behavioral Sciences

Research output: Contribution to journal › Article

38 Citations (Scopus)

Abstract

Ordinary least squares and stepwise selection are widespread in behavioral science research; however, these methods are well known to encounter overfitting problems such that R² and regression coefficients may be inflated while standard errors and p values may be deflated, ultimately reducing both the parsimony of the model and the generalizability of conclusions. Better methods for selecting predictors and estimating regression coefficients, such as regularization methods (e.g., Lasso), have existed for decades, are widely implemented in other disciplines, and are available in mainstream software; yet these methods are essentially invisible in the behavioral science literature while the use of suboptimal methods continues to proliferate. This paper discusses potential issues with standard statistical models, provides an introduction to regularization with specific details on both Lasso and its predecessor, ridge regression, provides an example analysis and code for running a Lasso analysis in R and SAS, and discusses limitations and related methods.
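The contrast the abstract draws, OLS retaining every predictor while the lasso shrinks irrelevant coefficients to exactly zero, can be illustrated with a short sketch. This is not the paper's own R or SAS code; it is a hypothetical Python example using scikit-learn and simulated data, with the shrinkage penalty chosen by cross-validation as the abstract describes.

```python
# Hypothetical illustration (not the article's R/SAS code): OLS vs. lasso
# on simulated data where only a few predictors truly matter.
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))

# Only the first 3 of 20 predictors have real effects; the rest invite
# overfitting under OLS or stepwise selection.
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(scale=1.0, size=n)

ols = LinearRegression().fit(X, y)
lasso = LassoCV(cv=5).fit(X, y)  # penalty selected by 5-fold cross-validation

# OLS assigns a nonzero coefficient to every predictor; the lasso sets
# many irrelevant coefficients exactly to zero, performing selection.
print("OLS nonzero coefficients:  ", int(np.sum(np.abs(ols.coef_) > 1e-8)))
print("Lasso nonzero coefficients:", int(np.sum(np.abs(lasso.coef_) > 1e-8)))
```

The sparser lasso solution is the point of the paper: a more parsimonious model whose coefficients are shrunk toward zero, trading a little bias for better generalizability.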

Original language: English (US)
Pages (from-to): 471-484
Number of pages: 14
Journal: Multivariate Behavioral Research
Volume: 50
Issue number: 5
DOI: 10.1080/00273171.2015.1036965
State: Published - Jan 1 2015
Externally published: Yes

Fingerprint

Behavioral Sciences
Lasso
Overfitting
Predictors
Regression Coefficient
Parsimony
Ridge Regression
Ordinary Least Squares
Behavioral Research
Regularization Method
p-Value
Standard error
Statistical Models
Least-Squares Analysis
Regularization

Keywords

  • lasso
  • overfitting
  • regression
  • regularization

ASJC Scopus subject areas

  • Statistics and Probability
  • Experimental and Cognitive Psychology
  • Arts and Humanities (miscellaneous)

Cite this

@article{0f84ebfdc58e4405b8d57a0b73de0a81,
title = "Using Lasso for Predictor Selection and to Assuage Overfitting: A Method Long Overlooked in Behavioral Sciences",
abstract = "Ordinary least squares and stepwise selection are widespread in behavioral science research; however, these methods are well known to encounter overfitting problems such that R2 and regression coefficients may be inflated while standard errors and p values may be deflated, ultimately reducing both the parsimony of the model and the generalizability of conclusions. Better methods for selecting predictors and estimating regression coefficients, such as regularization methods (e.g., Lasso), have existed for decades, are widely implemented in other disciplines, and are available in mainstream software; yet these methods are essentially invisible in the behavioral science literature while the use of suboptimal methods continues to proliferate. This paper discusses potential issues with standard statistical models, provides an introduction to regularization with specific details on both Lasso and its predecessor, ridge regression, provides an example analysis and code for running a Lasso analysis in R and SAS, and discusses limitations and related methods.",
keywords = "lasso, overfitting, regression, regularization",
author = "Daniel McNeish",
year = "2015",
month = "1",
day = "1",
doi = "10.1080/00273171.2015.1036965",
language = "English (US)",
volume = "50",
pages = "471--484",
journal = "Multivariate Behavioral Research",
issn = "0027-3171",
publisher = "Psychology Press Ltd",
number = "5",
}

TY - JOUR

T1 - Using Lasso for Predictor Selection and to Assuage Overfitting

T2 - A Method Long Overlooked in Behavioral Sciences

AU - McNeish, Daniel

PY - 2015/1/1

Y1 - 2015/1/1

N2 - Ordinary least squares and stepwise selection are widespread in behavioral science research; however, these methods are well known to encounter overfitting problems such that R2 and regression coefficients may be inflated while standard errors and p values may be deflated, ultimately reducing both the parsimony of the model and the generalizability of conclusions. Better methods for selecting predictors and estimating regression coefficients, such as regularization methods (e.g., Lasso), have existed for decades, are widely implemented in other disciplines, and are available in mainstream software; yet these methods are essentially invisible in the behavioral science literature while the use of suboptimal methods continues to proliferate. This paper discusses potential issues with standard statistical models, provides an introduction to regularization with specific details on both Lasso and its predecessor, ridge regression, provides an example analysis and code for running a Lasso analysis in R and SAS, and discusses limitations and related methods.

AB - Ordinary least squares and stepwise selection are widespread in behavioral science research; however, these methods are well known to encounter overfitting problems such that R2 and regression coefficients may be inflated while standard errors and p values may be deflated, ultimately reducing both the parsimony of the model and the generalizability of conclusions. Better methods for selecting predictors and estimating regression coefficients, such as regularization methods (e.g., Lasso), have existed for decades, are widely implemented in other disciplines, and are available in mainstream software; yet these methods are essentially invisible in the behavioral science literature while the use of suboptimal methods continues to proliferate. This paper discusses potential issues with standard statistical models, provides an introduction to regularization with specific details on both Lasso and its predecessor, ridge regression, provides an example analysis and code for running a Lasso analysis in R and SAS, and discusses limitations and related methods.

KW - lasso

KW - overfitting

KW - regression

KW - regularization

UR - http://www.scopus.com/inward/record.url?scp=84944058260&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84944058260&partnerID=8YFLogxK

U2 - 10.1080/00273171.2015.1036965

DO - 10.1080/00273171.2015.1036965

M3 - Article

C2 - 26610247

AN - SCOPUS:84944058260

VL - 50

SP - 471

EP - 484

JO - Multivariate Behavioral Research

JF - Multivariate Behavioral Research

SN - 0027-3171

IS - 5

ER -