Researcher-centered design of statistics: Why Bayesian statistics better fit the culture and incentives of HCI

Matthew Kay, Gregory L. Nelson, Eric B. Hekler

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

22 Citations (Scopus)

Abstract

A core tradition of HCI lies in the experimental evaluation of the effects of techniques and interfaces to determine if they are useful for achieving their purpose. However, our individual analyses tend to stand alone, and study results rarely accrue in more precise estimates via meta-analysis: in a literature search, we found only 56 meta-analyses in HCI in the ACM Digital Library, 3 of which were published at CHI (often called the top HCI venue). Yet meta-analysis is the gold standard for demonstrating robust quantitative knowledge. We treat this as a user-centered design problem: the failure to accrue quantitative knowledge is not the users' (i.e. researchers') failure, but a failure to consider those users' needs when designing statistical practice. Using simulation, we compare hypothetical publication worlds following existing frequentist against Bayesian practice. We show that Bayesian analysis yields more precise effects with each new study, facilitating knowledge accrual without traditional meta-analyses. Bayesian practices also allow more principled conclusions from small-n studies of novel techniques. These advantages make Bayesian practices a likely better fit for the culture and incentives of the field. Instead of admonishing ourselves to spend resources on larger studies, we propose using tools that more appropriately analyze small studies and encourage knowledge accrual from one study to the next. We also believe Bayesian methods can be adopted from the bottom up without the need for new incentives for replication or meta-analysis. These techniques offer the potential for a more user-(i.e. researcher-) centered approach to statistical analysis in HCI.
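The accrual mechanism the abstract describes can be sketched as follows (an illustration only, not the paper's actual simulation): in a conjugate normal model with known observation variance, each study's posterior becomes the next study's prior, so posterior precision (1/variance) adds up across studies without a separate meta-analysis step. All study values below are hypothetical.

```python
def update_normal(prior_mean, prior_var, obs_mean, obs_var):
    """Conjugate normal-normal update of an effect-size estimate."""
    post_prec = 1.0 / prior_var + 1.0 / obs_var  # precisions add
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + obs_mean / obs_var)
    return post_mean, post_var

# Three hypothetical small-n studies of the same effect: (mean, variance).
studies = [(0.42, 0.10), (0.35, 0.08), (0.50, 0.12)]

mean, var = 0.0, 1.0  # weakly informative starting prior
for obs_mean, obs_var in studies:
    mean, var = update_normal(mean, var, obs_mean, obs_var)
    print(f"posterior mean = {mean:.3f}, variance = {var:.4f}")
```

After the third update the posterior variance (~0.031) is smaller than any single study's variance, which is the sense in which estimates grow more precise with each new study.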

Original language: English (US)
Title of host publication: CHI 2016 - Proceedings, 34th Annual CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Pages: 4521-4532
Number of pages: 12
ISBN (Electronic): 9781450333627
DOI: 10.1145/2858036.2858465
State: Published - May 7, 2016
Event: 34th Annual Conference on Human Factors in Computing Systems, CHI 2016 - San Jose, United States
Duration: May 7, 2016 - May 12, 2016



Keywords

  • Bayesian statistics
  • Effect size
  • Estimation
  • Meta-analysis
  • Replication
  • Small studies

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
  • Software

Cite this

Kay, M., Nelson, G. L., & Hekler, E. B. (2016). Researcher-centered design of statistics: Why Bayesian statistics better fit the culture and incentives of HCI. In CHI 2016 - Proceedings, 34th Annual CHI Conference on Human Factors in Computing Systems (pp. 4521-4532). Association for Computing Machinery. https://doi.org/10.1145/2858036.2858465
