Statistics for program assessment: Has the program made a difference?

Mary R. Anderson-Rowland

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

As funding becomes scarcer and the demand for accountability increases, credible assessment and evaluation become more important. For example, funding is generally scarce for programs that establish and improve activities designed to increase enrollment and retention in engineering. Therefore, almost all funding allocated to these recruitment and retention activities requires an assessment of the program to see whether the money and time have been well spent. This paper describes basic statistical concepts that should be considered when assessing a program or activity. Examples illustrate both good and poor program assessment. Warnings are given about data that may turn out to be useless, and suggestions are presented on ways to enhance data presentation. What it takes for data to be "significant" is also discussed, as well as the problem of sample size. Without proper planning of assessments and data collection, it may be very difficult to show that a program has made a difference. A program director who does not have a good statistical background would be well advised to have an assessment person on the team to help plan the assessment strategy, analyze the data, and draw conclusions.
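To make the abstract's point about significance and sample size concrete, here is a minimal sketch of the kind of comparison the paper discusses. The scores, group sizes, and the scenario (program participants vs. a comparison group) are hypothetical illustrations, not data from the paper; the test shown (Welch's two-sample t) is one common choice, not necessarily the author's.

```python
# Hypothetical example: did a retention program make a difference?
# All numbers below are illustrative, not taken from the paper.
from statistics import mean, stdev

# Hypothetical first-year GPAs for program participants and a comparison group
program = [3.1, 2.9, 3.4, 3.0, 3.3, 2.8, 3.2, 3.1]
comparison = [2.8, 3.0, 2.7, 2.9, 3.1, 2.6, 2.9, 2.8]

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2      # sample variances
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

t = welch_t(program, comparison)
# With roughly 14 degrees of freedom, |t| > 2.145 is significant at
# alpha = 0.05 (two-sided). Small samples make this threshold hard to
# clear even for real effects -- the sample-size problem the paper raises.
print(f"t = {t:.2f}")
```

With these illustrative numbers the statistic clears the threshold, but halving either group would widen the standard error and could easily push the same mean difference below significance, which is why planning sample size before collecting data matters.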

Original language: English (US)
Title of host publication: ASEE Annual Conference Proceedings
Pages: 5503-5510
Number of pages: 8
State: Published - 2002
Event: 2002 ASEE Annual Conference and Exposition: Vive L'ingenieur - Montreal, Que., Canada
Duration: Jun 16, 2002 - Jun 19, 2002


Keywords

  • Assessment
  • Data Analysis
  • Evaluation
  • Statistical Testing

ASJC Scopus subject areas

  • Engineering(all)


Cite this

Anderson-Rowland, M. R. (2002). Statistics for program assessment: Has the program made a difference? In ASEE Annual Conference Proceedings (pp. 5503-5510).