Covariance pattern mixture models: Eliminating random effects to improve convergence and performance

Daniel McNeish, Jeffrey Harring

Research output: Contribution to journal › Article

Abstract

Growth mixture models (GMMs) are prevalent for modeling unknown population heterogeneity via distinct latent classes. However, GMMs are riddled with convergence issues, often requiring researchers to atheoretically alter the model with cross-class constraints simply to obtain convergence. We discuss how within-class random effects in GMMs exacerbate convergence issues, even though these random effects rarely help answer typical research questions. That is, latent classes provide a discretization of continuous random effects, so including additional random effects within latent classes can unnecessarily complicate the model. These random effects are commonly included in order to properly specify the marginal covariance; however, random effects are inefficient for patterning a covariance matrix, resulting in estimation issues. Such a goal can be achieved more simply through covariance pattern models, which we extend to the mixture model context in this article (covariance pattern mixture models, or CPMMs). We provide evidence from theory, simulation, and an empirical example showing that employing CPMMs (even if they are misspecified) instead of GMMs can circumvent the computational difficulties that can plague GMMs, without sacrificing the ability to answer the types of questions commonly asked in empirical studies. Our results show the advantages of CPMMs with respect to improved class enumeration and less biased class-specific growth trajectories, in addition to their vastly improved convergence rates. The results also show that constraining the covariance parameters across classes in order to bypass convergence issues with GMMs leads to poor results. An extensive software appendix is included to assist researchers in running CPMMs in Mplus.
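To illustrate the abstract's central contrast, the following sketch (not the authors' code; parameter values are hypothetical) compares the two ways of obtaining a structured marginal covariance for repeated measures: indirectly, via a within-class random intercept whose variance implies a compound-symmetry covariance, and directly, via a covariance pattern such as AR(1) that requires no random effects at all.

```python
import numpy as np

# Illustrative sketch: two routes to a structured marginal covariance
# for T = 4 repeated measures (values are arbitrary for illustration).
T = 4

# (1) Random-intercept model: the marginal covariance is *implied* by the
# random-effect variance tau2 and residual variance sigma2, yielding a
# compound-symmetry structure (constant covariance off the diagonal).
tau2, sigma2 = 2.0, 1.0
Sigma_re = tau2 * np.ones((T, T)) + sigma2 * np.eye(T)

# (2) Covariance pattern model: specify the marginal covariance *directly*,
# e.g., an AR(1) pattern with variance v and lag-1 correlation rho, so
# covariances decay with the lag between occasions.
v, rho = 3.0, 0.5
lags = np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
Sigma_ar1 = v * rho ** lags

print(Sigma_re)
print(Sigma_ar1)
```

Each structure uses only two parameters, but the pattern model patterns the covariance matrix without estimating within-class random effects, which is the simplification CPMMs carry into the mixture setting.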

Original language: English (US)
Pages (from-to): 947-979
Number of pages: 33
Journal: Behavior Research Methods
Volume: 52
Issue number: 3
State: Published - Jun 1 2020

Keywords

  • Constraints
  • Convergence
  • Finite mixture modeling
  • Growth mixture modeling
  • Latent class analysis

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • Psychology (all)
