Online Calibration of a Joint Model of Item Responses and Response Times in Computerized Adaptive Testing

Hyeon Ah Kang, Yi Zheng, Hua Hua Chang

Research output: Contribution to journal › Article

Abstract

With the widespread use of computers in modern assessment, online calibration has become increasingly popular as a way of replenishing an item pool. The present study discusses online calibration strategies for a joint model of responses and response times. The study proposes likelihood inference methods for item parameter estimation and evaluates their performance along with optimal sampling procedures. An extensive simulation study indicates that the proposed online calibration strategies perform well with relatively small samples (e.g., 500∼800 examinees). The analysis of estimated parameters suggests that response time information can be used to improve the recovery of the response model parameters. Among a number of sampling methods investigated, A-optimal sampling was found most advantageous when the item parameters were weakly correlated. When the parameters were strongly correlated, D-optimal sampling tended to achieve the most accurate parameter recovery. The study provides guidelines for deciding sampling design under a specific goal of online calibration given the characteristics of field-testing items.
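The abstract contrasts A-optimal and D-optimal sampling for selecting calibration examinees. As a minimal sketch of the two criteria, the snippet below computes them from the Fisher information matrix of a two-parameter logistic (2PL) item, assuming candidate examinee samples drawn around different ability levels; the 2PL model, function names, and parameter values are illustrative and not taken from the paper, which calibrates a joint model of responses and response times.

```python
import numpy as np

def fisher_info_2pl(a, b, thetas):
    """Fisher information matrix for (a, b) of a 2PL item,
    accumulated over a sample of examinee abilities."""
    info = np.zeros((2, 2))
    for theta in thetas:
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        w = p * (1.0 - p)
        # Gradient of the logit a*(theta - b) with respect to (a, b).
        g = np.array([theta - b, -a])
        info += w * np.outer(g, g)
    return info

def a_criterion(info):
    # A-optimality: minimize the trace of the inverse information
    # (sum of asymptotic parameter variances).
    return np.trace(np.linalg.inv(info))

def d_criterion(info):
    # D-optimality: maximize the determinant of the information
    # (volume of the confidence ellipsoid shrinks as det grows).
    return np.linalg.det(info)

# Compare two candidate samples for calibrating an item with a=1.2, b=0.3:
# abilities near the item's difficulty versus far from it.
rng = np.random.default_rng(0)
sample_near = rng.normal(0.3, 0.5, 500)
sample_far = rng.normal(2.0, 0.5, 500)

I_near = fisher_info_2pl(1.2, 0.3, sample_near)
I_far = fisher_info_2pl(1.2, 0.3, sample_far)
```

Under both criteria the near-difficulty sample is preferred here (smaller A-criterion, larger D-criterion), which illustrates why optimal sampling routes field-test items to informative examinees rather than to arbitrary ones.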

Original language: English (US)
Journal: Journal of Educational and Behavioral Statistics
DOI: 10.3102/1076998619879040
State: Accepted/In press - Jan 1 2019
Externally published: Yes


Keywords

  • computerized adaptive testing
  • item response theory
  • online calibration
  • optimal sampling
  • response time

ASJC Scopus subject areas

  • Education
  • Social Sciences (miscellaneous)

Cite this

@article{f627175f12d24780933c7c5adb59b350,
title = "Online Calibration of a Joint Model of Item Responses and Response Times in Computerized Adaptive Testing",
abstract = "With the widespread use of computers in modern assessment, online calibration has become increasingly popular as a way of replenishing an item pool. The present study discusses online calibration strategies for a joint model of responses and response times. The study proposes likelihood inference methods for item parameter estimation and evaluates their performance along with optimal sampling procedures. An extensive simulation study indicates that the proposed online calibration strategies perform well with relatively small samples (e.g., 500∼800 examinees). The analysis of estimated parameters suggests that response time information can be used to improve the recovery of the response model parameters. Among a number of sampling methods investigated, A-optimal sampling was found most advantageous when the item parameters were weakly correlated. When the parameters were strongly correlated, D-optimal sampling tended to achieve the most accurate parameter recovery. The study provides guidelines for deciding sampling design under a specific goal of online calibration given the characteristics of field-testing items.",
keywords = "computerized adaptive testing, item response theory, online calibration, optimal sampling, response time",
author = "Kang, {Hyeon Ah} and Yi Zheng and Chang, {Hua Hua}",
year = "2019",
month = "1",
day = "1",
doi = "10.3102/1076998619879040",
language = "English (US)",
journal = "Journal of Educational and Behavioral Statistics",
issn = "1076-9986",
publisher = "SAGE Publications Inc.",

}

TY - JOUR

T1 - Online Calibration of a Joint Model of Item Responses and Response Times in Computerized Adaptive Testing

AU - Kang, Hyeon Ah

AU - Zheng, Yi

AU - Chang, Hua Hua

PY - 2019/1/1

Y1 - 2019/1/1

N2 - With the widespread use of computers in modern assessment, online calibration has become increasingly popular as a way of replenishing an item pool. The present study discusses online calibration strategies for a joint model of responses and response times. The study proposes likelihood inference methods for item parameter estimation and evaluates their performance along with optimal sampling procedures. An extensive simulation study indicates that the proposed online calibration strategies perform well with relatively small samples (e.g., 500∼800 examinees). The analysis of estimated parameters suggests that response time information can be used to improve the recovery of the response model parameters. Among a number of sampling methods investigated, A-optimal sampling was found most advantageous when the item parameters were weakly correlated. When the parameters were strongly correlated, D-optimal sampling tended to achieve the most accurate parameter recovery. The study provides guidelines for deciding sampling design under a specific goal of online calibration given the characteristics of field-testing items.

AB - With the widespread use of computers in modern assessment, online calibration has become increasingly popular as a way of replenishing an item pool. The present study discusses online calibration strategies for a joint model of responses and response times. The study proposes likelihood inference methods for item parameter estimation and evaluates their performance along with optimal sampling procedures. An extensive simulation study indicates that the proposed online calibration strategies perform well with relatively small samples (e.g., 500∼800 examinees). The analysis of estimated parameters suggests that response time information can be used to improve the recovery of the response model parameters. Among a number of sampling methods investigated, A-optimal sampling was found most advantageous when the item parameters were weakly correlated. When the parameters were strongly correlated, D-optimal sampling tended to achieve the most accurate parameter recovery. The study provides guidelines for deciding sampling design under a specific goal of online calibration given the characteristics of field-testing items.

KW - computerized adaptive testing

KW - item response theory

KW - online calibration

KW - optimal sampling

KW - response time

UR - http://www.scopus.com/inward/record.url?scp=85074395112&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85074395112&partnerID=8YFLogxK

U2 - 10.3102/1076998619879040

DO - 10.3102/1076998619879040

M3 - Article

AN - SCOPUS:85074395112

JO - Journal of Educational and Behavioral Statistics

JF - Journal of Educational and Behavioral Statistics

SN - 1076-9986

ER -