Abstract
The present study examined the impact of augmented-reality visualization, in comparison to conventional ultrasound (CUS), on the learning of ultrasound-guided needle insertion. Whereas CUS requires cognitive processes for localizing targets, our augmented-reality device, called the sonic flashlight (SF), enables direct perceptual guidance. Participants guided a needle to an ultrasound-localized target within opaque fluid. In three experiments, the SF showed higher accuracy and lower variability in aiming and endpoint placements than did CUS. The SF, but not CUS, readily transferred to new targets and starting points for action. These effects were evident in visually guided action (needle and target continuously visible) and visually directed action (target alone visible). The results have implications for learning to visualize surgical targets through ultrasound.
| Original language | English (US) |
|---|---|
| Article number | 1 |
| Journal | ACM Transactions on Applied Perception |
| Volume | 5 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2 2008 |
| Externally published | Yes |
Keywords
- Augmented reality
- Learning
- Motor control
- Perception
- Spatial cognition
ASJC Scopus subject areas
- Theoretical Computer Science
- Computer Science (all)
- Experimental and Cognitive Psychology