The last decade has seen remarkable advances in upper-limb neuroprosthetics. Several groups have developed algorithms that enable control of devices by decoding neural signals recorded over an array of electrodes. These advances have encouraged researchers to move on to the control of neuroprosthetic hands, which faces two complications: (1) hands are geometrically far more complex than arms, and (2) hands are also sensitive and sophisticated sensory systems. In this chapter, we review the role of tactile and proprioceptive sensation in hand function, with a focus on the integration of multiple inputs to extract information about our haptic interactions with objects. We argue that creating a seamless somatosensory prosthetic system will require not only a detailed understanding of how individual deformations of the skin modulate neurons in primary somatosensory cortex, but also an understanding of how those signals are combined to create a somatosensory image.