Joint Maximum Likelihood Estimation for Diagnostic Classification Models

Chia Yi Chiu, Hans Friedrich Köhn, Yi Zheng, Robert Henson

Research output: Contribution to journal › Article › peer-review


Abstract

Joint maximum likelihood estimation (JMLE) is developed for diagnostic classification models (DCMs). JMLE has rarely been used in psychometrics because JMLE parameter estimators typically lack statistical consistency. The JMLE procedure presented here resolves the consistency issue by incorporating an external, statistically consistent estimator of examinees' proficiency class membership into the joint likelihood function, which in turn allows for the construction of item parameter estimators that also possess the consistency property. Consistency of the JMLE parameter estimators is established within the framework of general DCMs: the JMLE parameter estimators are derived for the Loglinear Cognitive Diagnosis Model (LCDM), and two consistency theorems are proven for the LCDM. Working within the framework of general DCMs makes the results and proofs also applicable to DCMs that can be expressed as submodels of the LCDM. Simulation studies are reported that evaluate the performance of JMLE with tests of varying length and different numbers of attributes. As a practical application, JMLE is also applied to "real world" educational data collected with a language proficiency test.
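
For orientation, a minimal sketch of the setup described above; the notation is illustrative and not taken from the article. For binary responses \(Y_{ij}\) of examinee \(i\) to item \(j\), attribute profiles \(\boldsymbol{\alpha}_i\), and item parameters \(\boldsymbol{\beta}_j\), the joint likelihood of a DCM treats both examinee and item quantities as unknowns:

\[
L(\boldsymbol{\alpha}_1,\dots,\boldsymbol{\alpha}_N,\ \boldsymbol{\beta}_1,\dots,\boldsymbol{\beta}_J \mid \mathbf{Y})
= \prod_{i=1}^{N}\prod_{j=1}^{J}
P_j(\boldsymbol{\alpha}_i;\boldsymbol{\beta}_j)^{Y_{ij}}
\bigl(1 - P_j(\boldsymbol{\alpha}_i;\boldsymbol{\beta}_j)\bigr)^{1-Y_{ij}},
\]

where, for the LCDM, the item response function is

\[
P_j(\boldsymbol{\alpha};\boldsymbol{\beta}_j)
= \frac{\exp\bigl(\lambda_{j0} + \boldsymbol{\lambda}_j^{\top}\mathbf{h}(\boldsymbol{\alpha},\mathbf{q}_j)\bigr)}
{1 + \exp\bigl(\lambda_{j0} + \boldsymbol{\lambda}_j^{\top}\mathbf{h}(\boldsymbol{\alpha},\mathbf{q}_j)\bigr)},
\]

with \(\mathbf{h}(\boldsymbol{\alpha},\mathbf{q}_j)\) collecting the main effects and interactions of the attributes that item \(j\) requires according to the Q-matrix row \(\mathbf{q}_j\). In the JMLE procedure the abstract describes, each unknown \(\boldsymbol{\alpha}_i\) is replaced by an external, statistically consistent classification \(\hat{\boldsymbol{\alpha}}_i\) (for example, one obtained by nonparametric classification), and the resulting function is maximized over the item parameters only, which is what makes consistent item parameter estimators attainable.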

Original language: English (US)
Pages (from-to): 1069-1092
Number of pages: 24
Journal: Psychometrika
Volume: 81
Issue number: 4
DOIs
State: Published - Dec 1 2016

Keywords

  • cognitive diagnosis
  • joint maximum likelihood estimation
  • nonparametric classification
  • statistical consistency

ASJC Scopus subject areas

  • General Psychology
  • Applied Mathematics
