1 Citation (Scopus)

Abstract

Emerging non-volatile memory (eNVM) based resistive synaptic devices have shown great potential for implementing deep neural networks (DNNs). However, eNVM devices typically suffer from various non-ideal effects that may degrade system performance. Based on a representative convolutional neural network (CNN) model for the CIFAR-10 dataset, this paper comprehensively investigates the impact of these non-ideal characteristics, such as nonlinearity and asymmetry of conductance tuning, variations, endurance, and retention, on the training/inference accuracy. Compact models of the device non-ideal effects are incorporated into the TensorFlow framework. Our simulation results suggest that 1) the training accuracy is more sensitive to the asymmetry of conductance tuning than to the nonlinearity; 2) conductance range variation does not degrade the training accuracy; instead, a small variation can even reduce the accuracy loss introduced by asymmetry; 3) device-to-device variation can likewise remedy the accuracy loss due to asymmetry, while cycle-to-cycle variation leads to significant accuracy degradation; 4) the accuracy degradation is not noticeable if the endurance exceeds 7,000 cycles; 5) different drift modes affect the inference accuracy differently, and the best case is when the conductance drifts up/down randomly.
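The nonlinear and asymmetric conductance tuning that the abstract identifies as the dominant accuracy limiter can be illustrated with the exponential pulse-response model commonly used for resistive synapses. This is a minimal sketch under stated assumptions: the function and parameter names are illustrative, and the paper's exact compact models (as incorporated into TensorFlow) may differ.

```python
import math

def conductance_after_pulses(p, p_max=100, g_min=0.0, g_max=1.0,
                             a_ltp=20.0, a_ltd=20.0, mode="LTP"):
    """Conductance after p identical programming pulses.

    Exponential nonlinearity model often used for eNVM synapses
    (illustrative, not taken from the paper):
        LTP: G(p) = g_min + B * (1 - exp(-p / A))
        LTD: G(p) = g_max - B * (1 - exp(-p / A))
    B normalizes the curve so a full train of p_max pulses spans
    [g_min, g_max]. A smaller A gives a more nonlinear curve, and
    a_ltp != a_ltd gives asymmetric potentiation/depression, the
    non-ideality the paper finds training accuracy is most
    sensitive to.
    """
    a = a_ltp if mode == "LTP" else a_ltd
    b = (g_max - g_min) / (1.0 - math.exp(-p_max / a))
    if mode == "LTP":
        return g_min + b * (1.0 - math.exp(-p / a))
    return g_max - b * (1.0 - math.exp(-p / a))

# Cycle-to-cycle variation could be emulated by perturbing each update
# with Gaussian noise (e.g. random.gauss(0.0, sigma)) and clipping the
# result to [g_min, g_max]; device-to-device variation by drawing a_ltp
# and a_ltd per device.
```

With these defaults the LTP curve is concave: half the pulses already recover well over half the conductance range, which is what makes identical-pulse weight updates inaccurate during in-situ training.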

Original language: English (US)
Journal: IEEE Journal on Emerging and Selected Topics in Circuits and Systems
DOI: 10.1109/JETCAS.2019.2933148
State: Accepted/In press - Jan 1 2019

Keywords

  • Convolutional neural networks
  • deep neural networks
  • Degradation
  • Emerging non-volatile memory
  • in-situ training
  • inference
  • Random access memory
  • reliability
  • synaptic devices
  • System-on-chip
  • Training
  • Tuning
  • variation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

@article{ff9bc98e0cce4ec2a07c099e15d95ddd,
title = "Impact of Non-ideal Characteristics of Resistive Synaptic Devices on Implementing Convolutional Neural Networks",
abstract = "Emerging non-volatile memory (eNVM) based resistive synaptic devices have shown great potential for implementing deep neural networks (DNNs). However, the eNVM devices typically suffer from various non-ideal effects which may degrade the performance of the system. Based on a representative convolutional neural network (CNN) model for CIFAR-10 dataset, this paper comprehensively investigates the impact of those non-ideal characteristics, such as nonlinearity and asymmetry of conductance tuning, variations, endurance and retention, on the training/inference accuracy. The compact models of the device non-ideal effects are incorporated into the TensorFlow framework. Our simulation results suggest that 1) the training accuracy is more sensitive to the asymmetry of conductance tuning than the nonlinearity; 2) the conductance range variation does not degrade the training accuracy, instead, a small variation can even reduce the accuracy loss introduced by asymmetry; 3) device-to-device variation can also remedy the accuracy loss due to asymmetry while cycle-to-cycle variation leads to significant accuracy degradation; 4) The accuracy degradation will not be noticeable if the endurance cycles are more than 7,000 cycles; 5) Different drifting modes affect the inference accuracy differently, and the best case is where the conductance is drifting up/down randomly.",
keywords = "Convolutional neural networks, deep neural networks, Degradation, Emerging non-volatile memory, in-situ training, inference, Random access memory, reliability, synaptic devices, System-on-chip, Training, Tuning, variation",
author = "Xiaoyu Sun and Shimeng Yu",
year = "2019",
month = "1",
day = "1",
doi = "10.1109/JETCAS.2019.2933148",
language = "English (US)",
journal = "IEEE Journal on Emerging and Selected Topics in Circuits and Systems",
issn = "2156-3357",
publisher = "IEEE Circuits and Systems Society",

}

TY - JOUR

T1 - Impact of Non-ideal Characteristics of Resistive Synaptic Devices on Implementing Convolutional Neural Networks

AU - Sun, Xiaoyu

AU - Yu, Shimeng

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Emerging non-volatile memory (eNVM) based resistive synaptic devices have shown great potential for implementing deep neural networks (DNNs). However, the eNVM devices typically suffer from various non-ideal effects which may degrade the performance of the system. Based on a representative convolutional neural network (CNN) model for CIFAR-10 dataset, this paper comprehensively investigates the impact of those non-ideal characteristics, such as nonlinearity and asymmetry of conductance tuning, variations, endurance and retention, on the training/inference accuracy. The compact models of the device non-ideal effects are incorporated into the TensorFlow framework. Our simulation results suggest that 1) the training accuracy is more sensitive to the asymmetry of conductance tuning than the nonlinearity; 2) the conductance range variation does not degrade the training accuracy, instead, a small variation can even reduce the accuracy loss introduced by asymmetry; 3) device-to-device variation can also remedy the accuracy loss due to asymmetry while cycle-to-cycle variation leads to significant accuracy degradation; 4) The accuracy degradation will not be noticeable if the endurance cycles are more than 7,000 cycles; 5) Different drifting modes affect the inference accuracy differently, and the best case is where the conductance is drifting up/down randomly.

AB - Emerging non-volatile memory (eNVM) based resistive synaptic devices have shown great potential for implementing deep neural networks (DNNs). However, the eNVM devices typically suffer from various non-ideal effects which may degrade the performance of the system. Based on a representative convolutional neural network (CNN) model for CIFAR-10 dataset, this paper comprehensively investigates the impact of those non-ideal characteristics, such as nonlinearity and asymmetry of conductance tuning, variations, endurance and retention, on the training/inference accuracy. The compact models of the device non-ideal effects are incorporated into the TensorFlow framework. Our simulation results suggest that 1) the training accuracy is more sensitive to the asymmetry of conductance tuning than the nonlinearity; 2) the conductance range variation does not degrade the training accuracy, instead, a small variation can even reduce the accuracy loss introduced by asymmetry; 3) device-to-device variation can also remedy the accuracy loss due to asymmetry while cycle-to-cycle variation leads to significant accuracy degradation; 4) The accuracy degradation will not be noticeable if the endurance cycles are more than 7,000 cycles; 5) Different drifting modes affect the inference accuracy differently, and the best case is where the conductance is drifting up/down randomly.

KW - Convolutional neural networks

KW - deep neural networks

KW - Degradation

KW - Emerging non-volatile memory

KW - in-situ training

KW - inference

KW - Random access memory

KW - reliability

KW - synaptic devices

KW - System-on-chip

KW - Training

KW - Tuning

KW - variation

UR - http://www.scopus.com/inward/record.url?scp=85070706192&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85070706192&partnerID=8YFLogxK

U2 - 10.1109/JETCAS.2019.2933148

DO - 10.1109/JETCAS.2019.2933148

M3 - Article

AN - SCOPUS:85070706192

JO - IEEE Journal on Emerging and Selected Topics in Circuits and Systems

JF - IEEE Journal on Emerging and Selected Topics in Circuits and Systems

SN - 2156-3357

ER -