TY - JOUR
T1 - Design and Development
T2 - 2021 ASEE Virtual Annual Conference, ASEE 2021
AU - Zhao, Zhen
AU - Carberry, Adam R.
AU - Larson, Jean S.
AU - Jordan, Michelle
AU - Savenye, Wilhelmina
AU - Eustice, Kristi L.
AU - Godwin, Allison
AU - Roehrig, Gillian
AU - Barr, Christopher
AU - Farnsworth, Kimberly
N1 - Funding Information:
This work is supported by the National Science Foundation Grant EEC-2023275. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We would also like to thank the leadership teams, education teams, and evaluation teams of all partner ERCs - ATP-Bio, CBBG, CISTAR, PATHS-UP, QESST, and NEWT - for their support and participation in knowledge sharing, data collection, and offering constructive feedback.
Publisher Copyright:
© American Society for Engineering Education, 2021
PY - 2021/7/26
Y1 - 2021/7/26
N2 - National Science Foundation (NSF) funded Engineering Research Centers (ERCs) must complement their technical research with education and outreach opportunities that broaden societal participation in engineering and foster collaboration between industry and academia. ERCs are expected to adequately evaluate their educational and outreach programs to ensure that these goals are met. Each ERC carries out this evaluation with full autonomy, designing and implementing its processes and tools in isolation. The evaluation tools used by individual ERCs are nevertheless often quite similar, suggesting that these isolated efforts have produced redundant resources whose results cannot be easily compared due to minor nuances and differences across tools. Evaluation within a single ERC also lacks the sample size needed to properly test the validity of any evaluation instrument. Leaders, educators, and evaluators from six ERCs are leading a collaborative effort to address these issues by building a suite of common evaluation instruments for use by current and future ERCs, as well as by other similarly structured STEM research centers. A common suite of instruments across ERCs would not only streamline ERC education evaluation but also enable large-scale assessment studies. This project aims to develop five major deliverables: 1) a common quantitative assessment instrument, 2) a web-based evaluation platform for the quantitative instrument, 3) a set of qualitative instruments, 4) an updated NSF ERC Best Practices Manual, and 5) supplemental resources within a new “ERC evaluator toolbox”.
AB - National Science Foundation (NSF) funded Engineering Research Centers (ERCs) must complement their technical research with education and outreach opportunities that broaden societal participation in engineering and foster collaboration between industry and academia. ERCs are expected to adequately evaluate their educational and outreach programs to ensure that these goals are met. Each ERC carries out this evaluation with full autonomy, designing and implementing its processes and tools in isolation. The evaluation tools used by individual ERCs are nevertheless often quite similar, suggesting that these isolated efforts have produced redundant resources whose results cannot be easily compared due to minor nuances and differences across tools. Evaluation within a single ERC also lacks the sample size needed to properly test the validity of any evaluation instrument. Leaders, educators, and evaluators from six ERCs are leading a collaborative effort to address these issues by building a suite of common evaluation instruments for use by current and future ERCs, as well as by other similarly structured STEM research centers. A common suite of instruments across ERCs would not only streamline ERC education evaluation but also enable large-scale assessment studies. This project aims to develop five major deliverables: 1) a common quantitative assessment instrument, 2) a web-based evaluation platform for the quantitative instrument, 3) a set of qualitative instruments, 4) an updated NSF ERC Best Practices Manual, and 5) supplemental resources within a new “ERC evaluator toolbox”.
UR - http://www.scopus.com/inward/record.url?scp=85124552875&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124552875&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85124552875
SN - 2153-5965
JO - ASEE Annual Conference and Exposition, Conference Proceedings
JF - ASEE Annual Conference and Exposition, Conference Proceedings
Y2 - 26 July 2021 through 29 July 2021
ER -