Applied tests of design skills

Divergent thinking data analysis and reliability studies

Jami J. Shah, Roger E. Millsap, Jay Woodward, S. M. Smith

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

A number of cognitive skills relevant to conceptual design have been identified: Divergent Thinking, Visual Thinking, Spatial Reasoning, Qualitative Reasoning, and Problem Formulation. A battery of standardized tests has been developed for these skills. We have previously reported on the contents and rationale of the divergent thinking and visual thinking tests. This paper focuses on data collection and detailed statistical analysis of one test, the divergent thinking test. The test has been given to over 500 engineering students and a smaller number of practicing engineers. It is designed to evaluate four direct measures (fluency, flexibility, originality, quality) and four indirect measures (abstractability, afixability, detailability, decomplexability). The eight questions on the test overlap in some measures, and the responses can be used to evaluate several measures independently (e.g., fluency and originality can be scored separately from the same idea set). The data on the 23 measured variables were factor analyzed using both exploratory and confirmatory procedures. Two variables were dropped from the exploratory analyses for reasons explained in the paper. For the remaining 21 variables, a four-factor solution with correlated (oblique) factors was deemed the best available solution after examining solutions with more factors. Five of the 21 variables did not load meaningfully on any of the four factors; these indirect measures did not appear to correlate strongly either among themselves or with the direct measures. The remaining 16 variables loaded on four factors corresponding to the different measures belonging to each of four questions. In other words, the fluency, flexibility, and originality variables did not form factors limited to these forms of creative thinking; instead, the analyses showed factors associated with the questions themselves (with the exception of the questions corresponding to indirect measures). This four-factor structure was then carried into a confirmatory factor-analytic procedure that adjusted for missing data. After some adjustments, the four-factor solution was found to provide a reasonable fit to the data. Estimated correlations among the four factors (F) ranged from a high of .32 (F1 with F2) to a low of .06 (F3 with F4), and all factor loadings were statistically significant.
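For readers who want to see the shape of the exploratory step described above, the following is a minimal sketch in Python using the factor_analyzer package: a four-factor model with an oblique rotation, pattern loadings, and inter-factor correlations. The data file name, column layout, complete-case handling, and the .30 loading cutoff are illustrative assumptions, not the authors' actual data or pipeline; the paper itself reports the definitive procedure.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical input: one row per test-taker, one column per scored
# measure (here, the 21 variables retained after the two drops).
scores = pd.read_csv("dt_test_scores.csv").dropna()

# Four factors with an oblique (oblimin) rotation, so factors may correlate.
fa = FactorAnalyzer(n_factors=4, rotation="oblimin", method="minres")
fa.fit(scores)

# Pattern loadings: variables with no loading of roughly .30 or more on
# any factor would be the ones that "did not load meaningfully".
print(pd.DataFrame(fa.loadings_, index=scores.columns).round(2))

# Estimated correlations among the oblique factors
# (cf. the .06 to .32 range reported in the abstract).
print(pd.DataFrame(fa.phi_).round(2))

The confirmatory stage with its missing-data adjustment would typically be run in a dedicated SEM tool (for example, lavaan in R with full-information maximum likelihood); the sketch covers only the exploratory stage.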

Original language: English (US)
Title of host publication: Proceedings of the ASME Design Engineering Technical Conference
Pages: 367-380
Number of pages: 14
Volume: 5
DOI: 10.1115/DETC2010-28886
ISBN: 9780791844137
State: Published - 2010
Event: ASME 2010 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE2010 - Montreal, QC, Canada
Duration: Aug 15, 2010 - Aug 18, 2010

ASJC Scopus subject areas

  • Mechanical Engineering
  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications
  • Modeling and Simulation

Cite this

Shah, J. J., Millsap, R. E., Woodward, J., & Smith, S. M. (2010). Applied tests of design skills: Divergent thinking data analysis and reliability studies. In Proceedings of the ASME Design Engineering Technical Conference (Vol. 5, pp. 367-380) https://doi.org/10.1115/DETC2010-28886
