Visual training for population vector based cortical control

R. Wahnoun, Stephen Helms Tillery, Jiping He

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We have developed a method for training animals to control a variety of devices from cortical signals [14]. In this report we describe a protocol to parameterize a cortical control algorithm without an animal having to move its arm. Instead, a highly motivated animal observes a computer cursor moving towards a set of targets once each in a center-out task. From the neuronal activity recorded in this visual following task, we compute the set of preferred directions for the neurons. We find that the quality of fit in this early set of trials is highly predictive of each neuron's contribution to the overall cortical control.
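The abstract describes the standard population-vector pipeline: regress each neuron's firing rate against movement (here, cursor) direction to estimate a preferred direction and a quality of fit, then decode by summing preferred directions weighted by modulation-normalized activity. The sketch below illustrates that pipeline on simulated cosine-tuned neurons; the eight-target layout, tuning parameters, and weighting scheme are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed center-out layout: 8 targets on the unit circle.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
targets = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # (8, 2)

# Simulated rates under cosine tuning:
# rate = b0 + m * cos(theta - theta_pd)  ==  b0 + direction . c,
# where c = m * (cos theta_pd, sin theta_pd).
n_neurons = 30
true_pd = rng.uniform(0, 2 * np.pi, n_neurons)
b0 = rng.uniform(10, 20, n_neurons)       # baseline rates
m = rng.uniform(2, 8, n_neurons)          # modulation depths
rates = b0[None, :] + m[None, :] * np.cos(angles[:, None] - true_pd[None, :])
rates += rng.normal(0, 0.5, rates.shape)  # observation noise

# Fit each neuron: regress rate on [1, dx, dy]. The coefficient
# vector (cx, cy) points along the preferred direction; R^2 is the
# quality-of-fit measure used to rank each neuron's contribution.
X = np.hstack([np.ones((len(targets), 1)), targets])  # (8, 3)
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)      # (3, n_neurons)
pd_vec = coef[1:].T                                   # (n_neurons, 2)
mod = np.linalg.norm(pd_vec, axis=1)                  # estimated depth m
pd_hat = pd_vec / np.maximum(mod, 1e-9)[:, None]      # unit preferred dirs

pred = X @ coef
ss_res = ((rates - pred) ** 2).sum(axis=0)
ss_tot = ((rates - rates.mean(axis=0)) ** 2).sum(axis=0)
r2 = 1.0 - ss_res / ss_tot                            # per-neuron fit quality

# Population-vector decode for one observed activity pattern:
# weight each unit preferred direction by the neuron's baseline-
# subtracted, modulation-normalized rate and sum.
obs = rates[0]                                        # activity for target 0
w = (obs - coef[0]) / np.maximum(mod, 1e-9)
pop_vec = (w[:, None] * pd_hat).sum(axis=0)
decoded = pop_vec / np.linalg.norm(pop_vec)           # should point at target 0
```

With well-modulated neurons the decoded direction aligns closely with the presented target, and the per-neuron `r2` values play the role the abstract describes: neurons with high quality of fit in the visual-following trials are the ones expected to contribute most under cortical control.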

Original language: English (US)
Title of host publication: 2005 First International Conference on Neural Interface and Control, Proceedings
Pages: 213-216
Number of pages: 4
DOI: 10.1109/ICNIC.2005.1499880
Publication status: Published - 2005
Event: 2005 First International Conference on Neural Interface and Control - Wuhan, China
Duration: May 26, 2005 - May 28, 2005



Keywords

  • Algorithm
  • Cortical coding
  • Cortical control
  • Motor cortex
  • Neuroprosthetics

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Wahnoun, R., Helms Tillery, S., & He, J. (2005). Visual training for population vector based cortical control. In 2005 First International Conference on Neural Interface and Control, Proceedings (pp. 213-216). [1499880] https://doi.org/10.1109/ICNIC.2005.1499880