Exploiting Hybrid Precision for Training and Inference: A 2T-1FeFET Based Analog Synaptic Weight Cell

Xiaoyu Sun, Panni Wang, Kai Ni, Suman Datta, Shimeng Yu

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract

In-memory computing with analog non-volatile memories (NVMs) can accelerate both the in-situ training and inference of deep neural networks (DNNs) by parallelizing multiply-accumulate (MAC) operations in the analog domain. However, in-situ training accuracy suffers unacceptable degradation due to undesired weight-update asymmetry/nonlinearity and limited bit precision. In this work, we overcome this challenge by introducing a compact Ferroelectric FET (FeFET) based synaptic cell that exploits hybrid precision for in-situ training and inference. We propose a novel hybrid approach in which the modulated 'volatile' gate voltage of the FeFET represents the least significant bits (LSBs) for symmetric/linear updates during training only, while the 'non-volatile' polarization states of the FeFET hold the most significant bits (MSBs) for inference. This design is demonstrated with an experimentally validated FeFET SPICE model and co-simulation with the TensorFlow framework. The results show that with the proposed 6-bit and 7-bit synapse designs, in-situ training accuracy reaches ∼97.3% on the MNIST dataset and ∼87% on the CIFAR-10 dataset, respectively, approaching ideal software-based training.
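The hybrid-precision idea in the abstract — volatile LSBs absorbing small, symmetric training updates, with carries transferred into non-volatile MSBs that alone are retained for inference — can be sketched in software. This is a minimal illustrative model, not the authors' implementation: the class name, bit split, and carry logic are assumptions chosen to show the bookkeeping, and the analog device physics is abstracted away entirely.

```python
# Illustrative sketch of a hybrid-precision synaptic weight: LSBs (volatile,
# e.g. a modulated gate voltage) accumulate small symmetric updates during
# training; overflow/underflow carries are transferred into the MSBs
# (non-volatile, e.g. FeFET polarization states), which alone are read at
# inference time.

class HybridWeight:
    def __init__(self, msb_bits=4, lsb_bits=2):
        self.msb_bits = msb_bits
        self.lsb_bits = lsb_bits
        self.msb = 0  # non-volatile part, kept for inference
        self.lsb = 0  # volatile part, used during training only

    def update(self, delta_lsb):
        """Apply a signed update at LSB granularity (symmetric and linear)."""
        self.lsb += delta_lsb
        # Carry any overflow/underflow of the LSB register into the MSBs.
        carry, self.lsb = divmod(self.lsb, 1 << self.lsb_bits)
        # Clamp the MSBs to their representable range.
        self.msb = max(0, min((1 << self.msb_bits) - 1, self.msb + carry))

    def inference_value(self):
        """Only the non-volatile MSBs survive to inference."""
        return self.msb << self.lsb_bits

w = HybridWeight()
for d in (3, 3, -2, 5):  # small symmetric updates during training
    w.update(d)
print(w.msb, w.lsb, w.inference_value())  # prints: 2 1 8
```

The point of the split is that the LSB register can be updated with near-perfect symmetry/linearity (here, plain integer addition), while the coarse MSB states only need to change on a carry, sidestepping the asymmetric analog weight-update problem the abstract identifies.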

Original language: English (US)
Title of host publication: 2018 IEEE International Electron Devices Meeting, IEDM 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3.1.1-3.1.4
ISBN (Electronic): 9781728119878
DOI: 10.1109/IEDM.2018.8614611
State: Published - Jan 16 2019
Event: 64th Annual IEEE International Electron Devices Meeting, IEDM 2018 - San Francisco, United States
Duration: Dec 1 2018 – Dec 5 2018

Publication series

Name: Technical Digest - International Electron Devices Meeting, IEDM
Volume: 2018-December
ISSN (Print): 0163-1918

Conference

Conference: 64th Annual IEEE International Electron Devices Meeting, IEDM 2018
Country: United States
City: San Francisco
Period: 12/1/18 – 12/5/18


ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Electrical and Electronic Engineering
  • Materials Chemistry

Cite this

Sun, X., Wang, P., Ni, K., Datta, S., & Yu, S. (2019). Exploiting Hybrid Precision for Training and Inference: A 2T-1FeFET Based Analog Synaptic Weight Cell. In 2018 IEEE International Electron Devices Meeting, IEDM 2018 (pp. 3.1.1-3.1.4). [8614611] (Technical Digest - International Electron Devices Meeting, IEDM; Vol. 2018-December). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IEDM.2018.8614611
