A Crowdsourced System for Creating Practice Questions in a Clinical Presentation Medical Curriculum

M. Rick Stone, Marjorie Kinney, Carolyn Chatterton, Robin Pettit

Research output: Contribution to journal › Article

Abstract

Overview: Medical students must learn a large amount of information in their first 2 years of medical school. Question banks such as UWorld and COMBANK are a popular method of preparation for national board exams, but it is difficult to have a similar uniform resource to prepare for exams administered by individual medical schools because the curriculum varies from school to school. Project Creation and Implementation: In order to help prepare for course exams, students from the Class of 2017 at A.T. Still University School of Osteopathic Medicine in Arizona (ATSU-SOMA) collaborated to create crowdsourced practice quizzes based on the specific material taught at their school. Google Drive was used to manage sign-up sheets and collect questions, and Blackboard was used to create automatically graded practice quizzes. Methods and Results: Participants were given a survey at the end of their second year of medical school to assess their opinions of the project’s effectiveness. Students indicated that participation in the project helped them feel more confident on exams, improved their ability to write higher-order, clinically based questions, and improved their ability to predict what types of questions would be used on school-administered exams. Participants ranked the crowdsourced practice quizzes as more useful than textbook practice questions and as useful as faculty-written practice quizzes, board question banks, and verbal quizzing in study groups in preparing for school-administered exams. Comparison of study participant course grades and medical-school grade point average suggested the practice quizzes may benefit lower-performing students more than higher-performing students.

Original language: English (US)
Pages (from-to): 685-692
Number of pages: 8
Journal: Medical Science Educator
Volume: 27
Issue number: 4
DOI: 10.1007/s40670-017-0462-9
State: Published - Dec 1 2017
Externally published: Yes

Keywords

  • Active learning
  • Collaborative learning
  • Crowdsourcing
  • Practice questions
  • Student-driven review

ASJC Scopus subject areas

  • Medicine (miscellaneous)
  • Education

Cite this

A Crowdsourced System for Creating Practice Questions in a Clinical Presentation Medical Curriculum. / Rick Stone, M.; Kinney, Marjorie; Chatterton, Carolyn; Pettit, Robin.

In: Medical Science Educator, Vol. 27, No. 4, 01.12.2017, p. 685-692.

@article{89528dcc2ef54f8e98a681aedaaf843f,
title = "A Crowdsourced System for Creating Practice Questions in a Clinical Presentation Medical Curriculum",
keywords = "Active learning, Collaborative learning, Crowdsourcing, Practice questions, Student-driven review",
author = "{Rick Stone}, M. and Marjorie Kinney and Carolyn Chatterton and Robin Pettit",
year = "2017",
month = "12",
day = "1",
doi = "10.1007/s40670-017-0462-9",
language = "English (US)",
volume = "27",
pages = "685--692",
journal = "Medical Science Educator",
issn = "2156-8650",
publisher = "Springer New York",
number = "4",
}

TY - JOUR

T1 - A Crowdsourced System for Creating Practice Questions in a Clinical Presentation Medical Curriculum

AU - Rick Stone, M.

AU - Kinney, Marjorie

AU - Chatterton, Carolyn

AU - Pettit, Robin

PY - 2017/12/1

Y1 - 2017/12/1

KW - Active learning

KW - Collaborative learning

KW - Crowdsourcing

KW - Practice questions

KW - Student-driven review

UR - http://www.scopus.com/inward/record.url?scp=85061910924&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85061910924&partnerID=8YFLogxK

U2 - 10.1007/s40670-017-0462-9

DO - 10.1007/s40670-017-0462-9

M3 - Article

VL - 27

SP - 685

EP - 692

JO - Medical Science Educator

JF - Medical Science Educator

SN - 2156-8650

IS - 4

ER -