Supporting answerers with feedback in social Q&A

John Frens, Erin Walker, Gary Hsieh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation


Prior research has examined the use of Social Question and Answer (Q&A) websites for answer and help seeking. However, the potential for these websites to support domain learning has not yet been realized. Helping users write effective answers can benefit subject-area learning for both answerers and the recipients of answers. In this study, we examine the utility of crowdsourced, criteria-based feedback for answerers on a student-centered Q&A website. In an experiment with 55 users, we compared perceptions of the current rating system against two feedback designs with explicit criteria (Appropriate, Understandable, and Generalizable). Contrary to our hypotheses, answerers disagreed with and rejected the criteria-based feedback. Although the criteria aligned with answerers' goals, and the crowdsourced ratings were found to be objectively accurate, the norms and expectations for answers on Brainly conflicted with our design. We conclude with implications for the design of feedback in social Q&A.

Original language: English (US)
Title of host publication: Proceedings of the 5th Annual ACM Conference on Learning at Scale, L@S 2018
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450358866
State: Published - Jun 26, 2018
Event: 5th Annual ACM Conference on Learning at Scale, L@S 2018 - London, United Kingdom
Duration: Jun 26, 2018 – Jun 28, 2018


Other: 5th Annual ACM Conference on Learning at Scale, L@S 2018
Country/Territory: United Kingdom


Keywords

  • Crowd Assessment
  • Feedback
  • Informal Learning
  • Peer Help

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Education
  • Software
  • Computer Science Applications


