Usability evaluation of an experimental text summarization system and three search engines: implications for the reengineering of health care interfaces.

Andre W. Kushniruk, Min-Yen Kan, Kathleen McKeown, Judith Klavans, Desmond Jordan, Mark LaFlamme, Vimla L. Patel

Research output: Contribution to journal › Article



This paper describes a comparative evaluation of an experimental automated text summarization system, Centrifuser, and three conventional search engines, including Google and Yahoo. Centrifuser provides information to patients and families relevant to their questions about specific health conditions: it produces a multi-document summary of articles retrieved by a standard search engine, tailored to the user's question. Subjects, consisting of friends or family members of hospitalized patients, were asked to "think aloud" as they interacted with the four systems. The evaluation involved audio- and video-recording of subject interactions with the interfaces in situ at a hospital. Results of the evaluation show that subjects found Centrifuser's summarization capability useful and easy to understand. In comparing Centrifuser with the three search engines, subjects' ratings varied; however, specific interface features were deemed useful across interfaces. We conclude with a discussion of the implications for engineering Web-based retrieval systems.

Original language: English (US)
Pages (from-to): 420-424
Number of pages: 5
Journal: Proceedings / AMIA ... Annual Symposium. AMIA Symposium
State: Published - 2002


ASJC Scopus subject areas

  • Medicine (all)
