Pssst… Textual features… There is more to automatic essay scoring than just you!

Scott Crossley, Laura K. Allen, Danielle McNamara, Erica L. Snow

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Scopus citations

Abstract

This study investigates a new approach to automatically assessing essay quality that combines traditional approaches based on assessing textual features with new approaches that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and student attributes leads to essay scoring models that are on par with state-of-the-art scoring models. Such findings expand our knowledge of textual and nontextual features that are predictive of writing success.

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Publisher: Association for Computing Machinery
Pages: 203-207
Number of pages: 5
Volume: 16-20-March-2015
ISBN (Print): 9781450334174
DOIs
State: Published - Mar 16 2015
Event: 5th International Conference on Learning Analytics and Knowledge, LAK 2015 - Poughkeepsie, United States
Duration: Mar 16 2015 - Mar 20 2015

Other

Other: 5th International Conference on Learning Analytics and Knowledge, LAK 2015
Country/Territory: United States
City: Poughkeepsie
Period: 3/16/15 - 3/20/15

Keywords

  • Automatic essay scoring
  • Corpus linguistics
  • Data mining
  • Individual differences
  • Intelligent tutoring systems
  • Natural language processing

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software
