Brain-inspired computing is an emerging field that aims to reach brain-like performance in real-time processing of sensory data. The challenges that must be addressed to realize such a computational system include building a compact, massively parallel architecture with scalable interconnection devices, achieving ultralow power consumption, and developing robust neuromorphic computational schemes for implementing learning in hardware. In this paper, we discuss programming strategies, material characteristics, and spike schemes that enable the implementation of symmetric and asymmetric synaptic plasticity with devices using phase-change materials. We demonstrate that energy consumption can be optimized by tuning the device operation regime and the spike scheme. Our simulations illustrate that a crossbar array consisting of synaptic devices and neurons can achieve hippocampus-like associative learning with symmetric synapses and sequence learning with asymmetric synapses. Pattern completion for patterns with 50% missing elements is achieved via associative learning with symmetric plasticity. The robustness of learning against input noise, variation in sensory data, and device resistance variation is investigated through simulations.
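The associative learning and pattern-completion behavior described above can be illustrated with a minimal software sketch of a Hopfield-style network with symmetric synaptic weights. This is a hedged illustration only: the function names, the 16-element pattern, and the update schedule are assumptions for demonstration, not the crossbar-array simulation from the paper. It stores one bipolar pattern with a Hebbian outer-product rule, corrupts 50% of its elements, and recovers the original via asynchronous updates.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule; weights are symmetric with zero diagonal,
    mirroring the symmetric synaptic plasticity used for associative learning."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W / len(patterns)

def recall(W, state, sweeps=20):
    """Asynchronous neuron updates until the network settles into a stored state."""
    state = state.copy()
    rng = np.random.default_rng(0)
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one 16-element bipolar pattern (hypothetical example data).
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1, 1, 1, -1, 1, -1, -1, 1, -1])
W = train_hopfield(pattern[None, :])

# Corrupt 50% of the elements, then complete the pattern via recall.
cue = pattern.copy()
cue[: len(cue) // 2] = 1
restored = recall(W, cue)
print(np.array_equal(restored, pattern))
```

In a hardware realization, the symmetric weight matrix `W` would correspond to conductance states of the phase-change synaptic devices at the crossbar junctions, and the sign-threshold update to the neuron circuits at the array periphery.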
- Hopfield network
- phase-change materials
- spike-timing-dependent plasticity (STDP)
- synaptic device