Visual training for population vector based cortical control

Jiping He, Stephen Helms Tillery, R. Wahnoun

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We have developed a method for training animals to control a variety of devices from cortical signals. In this report we describe a protocol to parameterize a cortical control algorithm without an animal having to move its arm. Instead, a highly motivated animal observes a computer cursor moving towards a set of targets once each in a center-out task. From the neuronal activity recorded in this visual following task, we compute the set of preferred directions for the neurons. We find that the quality of fit in this early set of trials is highly predictive of each neuron's contribution to the overall cortical control.
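The pipeline the abstract describes (fit each neuron's preferred direction from a single observation pass of a center-out task, score the fit quality, then decode with a population vector) can be sketched as follows. This is a minimal illustration on synthetic data: the target count, neuron count, cosine-tuning model, and all firing rates are assumptions for the sketch, not the paper's recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not the paper's data): 8 center-out target
# directions observed once each, and 16 simulated neurons whose rates
# follow cosine tuning around a hidden preferred direction (PD).
directions = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
true_pds = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
rates = (10.0 + 8.0 * np.cos(directions[:, None] - true_pds[None, :])
         + rng.normal(0.0, 1.0, (8, 16)))          # spikes/s plus noise

# Fit f(theta) = b0 + bc*cos(theta) + bs*sin(theta) per neuron by least
# squares; the PD estimate is atan2(bs, bc).
X = np.column_stack([np.ones_like(directions),
                     np.cos(directions), np.sin(directions)])
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)   # shape (3, n_neurons)
pds = np.arctan2(coef[2], coef[1])

# Quality of fit (R^2) per neuron -- the quantity the abstract reports
# as predictive of each neuron's contribution to cortical control.
resid = rates - X @ coef
r2 = 1.0 - ((resid ** 2).sum(axis=0)
            / ((rates - rates.mean(axis=0)) ** 2).sum(axis=0))

# Population-vector decode of a new movement: weight each neuron's unit
# PD vector by its baseline-subtracted, modulation-normalized rate.
target = 0.5                                       # radians, arbitrary
new_rates = 10.0 + 8.0 * np.cos(target - true_pds)
weights = (new_rates - coef[0]) / np.hypot(coef[1], coef[2])
pop_vec = (weights[:, None]
           * np.column_stack([np.cos(pds), np.sin(pds)])).sum(axis=0)
decoded = np.arctan2(pop_vec[1], pop_vec[0])       # close to `target`
```

In this sketch `r2` plays the role of the fit-quality measure the abstract says predicts each neuron's usefulness: neurons with low `r2` would contribute weakly or erratically to the population vector and could be excluded from control.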

Original language: English (US)
Title of host publication: IFAC Proceedings Volumes (IFAC-PapersOnline)
Pages: 278-282
Number of pages: 5
Volume: 16
State: Published - 2005
Event: 16th Triennial World Congress of International Federation of Automatic Control, IFAC 2005 - Prague, Czech Republic
Duration: Jul 3, 2005 - Jul 8, 2005

Other

Other: 16th Triennial World Congress of International Federation of Automatic Control, IFAC 2005
Country: Czech Republic
City: Prague
Period: 7/3/05 - 7/8/05


Keywords

  • Algorithm
  • Cortical coding
  • Cortical control
  • Motor cortex
  • Neuroprosthetics

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

He, J., Helms Tillery, S., & Wahnoun, R. (2005). Visual training for population vector based cortical control. In IFAC Proceedings Volumes (IFAC-PapersOnline) (Vol. 16, pp. 278-282).

@inproceedings{9f01cfdfdeba4e308cb38ace7c227879,
title = "Visual training for population vector based cortical control",
abstract = "We have developed a method for training animals to control a variety of devices from cortical signals. In this report we describe a protocol to parameterize a cortical control algorithm without an animal having to move its arm. Instead, a highly motivated animal observes a computer cursor moving towards a set of targets once each in a center-out task. From the neuronal activity recorded in this visual following task, we compute the set of preferred directions for the neurons. We find that the quality of fit in this early set of trials is highly predictive of each neuron's contribution to the overall cortical control.",
keywords = "Algorithm, Cortical coding, Cortical control, Motor cortex, Neuroprosthetics",
author = "Jiping He and {Helms Tillery}, Stephen and R. Wahnoun",
year = "2005",
language = "English (US)",
isbn = "008045108X",
volume = "16",
pages = "278--282",
booktitle = "IFAC Proceedings Volumes (IFAC-PapersOnline)",

}

TY - GEN

T1 - Visual training for population vector based cortical control

AU - He, Jiping

AU - Helms Tillery, Stephen

AU - Wahnoun, R.

PY - 2005

Y1 - 2005

N2 - We have developed a method for training animals to control a variety of devices from cortical signals. In this report we describe a protocol to parameterize a cortical control algorithm without an animal having to move its arm. Instead, a highly motivated animal observes a computer cursor moving towards a set of targets once each in a center-out task. From the neuronal activity recorded in this visual following task, we compute the set of preferred directions for the neurons. We find that the quality of fit in this early set of trials is highly predictive of each neuron's contribution to the overall cortical control.

AB - We have developed a method for training animals to control a variety of devices from cortical signals. In this report we describe a protocol to parameterize a cortical control algorithm without an animal having to move its arm. Instead, a highly motivated animal observes a computer cursor moving towards a set of targets once each in a center-out task. From the neuronal activity recorded in this visual following task, we compute the set of preferred directions for the neurons. We find that the quality of fit in this early set of trials is highly predictive of each neuron's contribution to the overall cortical control.

KW - Algorithm

KW - Cortical coding

KW - Cortical control

KW - Motor cortex

KW - Neuroprosthetics

UR - http://www.scopus.com/inward/record.url?scp=79960726152&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79960726152&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:79960726152

SN - 008045108X

SN - 9780080451084

VL - 16

SP - 278

EP - 282

BT - IFAC Proceedings Volumes (IFAC-PapersOnline)

ER -