A cognitive evaluation of four online search engines for answering definitional questions posed by physicians

Hong Yu, David Kaufman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

20 Citations (Scopus)

Abstract

The Internet is having a profound impact on physicians' medical decision making. One recent survey of 277 physicians showed that 72% regularly used the Internet to research medical information and 51% admitted that information from websites influenced their clinical decisions. This paper describes the first cognitive evaluation of four state-of-the-art Internet search engines for answering definitional questions (i.e., questions of the form "What is X?") posed by physicians: Google (i.e., Google and Scholar.Google), MedQA, Onelook, and PubMed. Onelook is a portal for online definitions, and MedQA is a question-answering system that automatically generates short texts to answer specific biomedical questions. Our evaluation criteria include quality of answer, ease of use, time spent, and number of actions taken. Our results show that MedQA outperforms Onelook and PubMed on most of the criteria, and that MedQA surpasses Google in time spent and number of actions, two important efficiency criteria. Google ranks best in quality of answer and ease of use. We conclude that Google is an effective search engine for medical definitions, and that MedQA exceeds the other search engines in that it provides users direct answers to their questions, while users of the other search engines have to visit several sites before finding all of the pertinent information.

Original language: English (US)
Title of host publication: Pacific Symposium on Biocomputing 2007, PSB 2007
Pages: 328-339
Number of pages: 12
State: Published - 2007
Externally published: Yes
Event: Pacific Symposium on Biocomputing, PSB 2007 - Maui, HI, United States
Duration: Jan 3, 2007 to Jan 7, 2007

Other

Other: Pacific Symposium on Biocomputing, PSB 2007
Country: United States
City: Maui, HI
Period: 1/3/07 to 1/7/07

Fingerprint

  • Search engines
  • Internet
  • Physicians
  • PubMed
  • Biomedical Research
  • Websites
  • Decision making

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Biomedical Engineering
  • Medicine(all)

Cite this

Yu, H., & Kaufman, D. (2007). A cognitive evaluation of four online search engines for answering definitional questions posed by physicians. In Pacific Symposium on Biocomputing 2007, PSB 2007 (pp. 328-339).

@inproceedings{cd77572e240741a98534e233eb8c917d,
title = "A cognitive evaluation of four online search engines for answering definitional questions posed by physicians",
author = "Hong Yu and David Kaufman",
year = "2007",
language = "English (US)",
isbn = "9812704175",
pages = "328--339",
booktitle = "Pacific Symposium on Biocomputing 2007, PSB 2007",

}

TY - GEN

T1 - A cognitive evaluation of four online search engines for answering definitional questions posed by physicians

AU - Yu, Hong

AU - Kaufman, David

PY - 2007

Y1 - 2007

UR - http://www.scopus.com/inward/record.url?scp=38449100055&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=38449100055&partnerID=8YFLogxK

M3 - Conference contribution

C2 - 17990503

AN - SCOPUS:38449100055

SN - 9812704175

SN - 9789812704177

SP - 328

EP - 339

BT - Pacific Symposium on Biocomputing 2007, PSB 2007

ER -