Abstract

The resistive cross-point array architecture has been proposed for on-chip implementation of the weighted sum and weight update operations in neuro-inspired learning algorithms. However, several limiting factors potentially hamper the learning accuracy, including the nonlinearity and device variations in weight update, and the read noise, limited ON/OFF weight ratio, and array parasitics in weighted sum. With unsupervised sparse coding as a case-study algorithm, this paper employs device-algorithm co-design methodologies to quantify and mitigate the impact of these non-ideal properties on accuracy. Our analysis shows that the realistic properties in weight update are tolerable, while those in weighted sum are detrimental to the accuracy. With calibration of realistic synaptic behaviors from experimental data, our study shows that the recognition accuracy on MNIST handwritten digits degrades from ∼96 to ∼30 percent. The strategies to mitigate this accuracy loss include 1) redundant cells to alleviate the impact of device variations; 2) a dummy column to eliminate the off-state current; and 3) a selector and larger wire width to reduce IR drop along interconnects. The selector also reduces the leakage power in weight update. With the properties improved by these strategies, the accuracy increases back to ∼95 percent, enabling reliable integration of realistic synaptic devices in neuromorphic systems.
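The dummy-column idea in strategy 2 can be illustrated with a small numerical sketch: weights map to conductances between G_off and G_on, so every column current carries a common G_off offset that a dummy column (all cells at G_off) can subtract out. The function names, normalized conductance values, and ON/OFF ratio below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def crossbar_weighted_sum(W, x, on_off_ratio=10.0):
    """Map weights in [0, 1] to conductances in [G_off, G_on] and
    return the per-column currents I = x @ G (Ohm's and Kirchhoff's laws)."""
    g_on = 1.0                          # normalized ON conductance (assumed)
    g_off = g_on / on_off_ratio         # finite OFF conductance
    G = g_off + W * (g_on - g_off)      # linear weight-to-conductance map
    return x @ G

def dummy_column_current(x, on_off_ratio=10.0):
    """Current of a dummy column whose cells all sit at G_off; this equals
    the off-state offset shared by every real column."""
    return x.sum() * (1.0 / on_off_ratio)

rng = np.random.default_rng(0)
W = rng.random((4, 3))                  # 4 inputs x 3 output columns
x = rng.random(4)                       # normalized input voltages

raw = crossbar_weighted_sum(W, x)
corrected = raw - dummy_column_current(x)

# With the offset removed, the result is the ideal weighted sum
# scaled by the conductance swing (g_on - g_off):
ideal = (x @ W) * (1.0 - 1.0 / 10.0)
print(np.allclose(corrected, ideal))    # True
```

The subtraction removes only the systematic G_off offset; read noise and IR drop along the interconnects (strategy 3) are separate error sources not modeled here.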

Original language: English (US)
Article number: 7536617
Pages (from-to): 257-264
Number of pages: 8
Journal: IEEE Transactions on Multi-Scale Computing Systems
Volume: 2
Issue number: 4
DOIs
State: Published - Oct 1 2016

Keywords

  • cross-point array
  • machine learning
  • neuro-inspired computing
  • resistive memory
  • selector
  • synaptic device

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Information Systems
  • Hardware and Architecture
