Fully Integrated Analog Machine Learning Classifier Using Custom Activation Function for Low Resolution Image Classification

Sanjeev Tannirkulam Chandrasekaran, Akshay Jayaraj, Vinay Elkoori Ghantala Karnam, Imon Banerjee, Arindam Sanyal

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a fully-integrated analog neural network classifier architecture for low-resolution image classification that eliminates memory access. We design custom activation functions using single-stage common-source amplifiers and apply a hardware-software co-design methodology that incorporates knowledge of the custom activation functions into the training phase to achieve high accuracy. Performing all computations entirely in the analog domain eliminates the energy cost associated with memory access and data movement. We demonstrate our classifier on the multinomial classification task of recognizing down-sampled handwritten digits from the MNIST dataset. Fabricated in a 65nm CMOS process, the prototype consumes a measured 173pJ/classification on the down-sampled MNIST dataset, which is 3× better than the state-of-the-art. The prototype IC achieves a mean classification accuracy of 81.3% even after down-sampling the original MNIST images by 96%, from 28×28 pixels to 5×5 pixels.
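The hardware-software co-design idea in the abstract — training the network with the amplifier's actual nonlinearity rather than an ideal activation — can be sketched in software. The snippet below is a minimal illustration, not the paper's implementation: `amp_activation` is a hypothetical saturating (tanh-like) stand-in for a single-stage common-source amplifier's transfer curve, and the 25-input layer mirrors the flattened 5×5 down-sampled images; the gain and saturation values are illustrative assumptions.

```python
import numpy as np

# Hypothetical saturating activation standing in for a single-stage
# common-source amplifier's transfer curve. In the co-design flow, a
# measured/simulated transfer function would replace this model.
def amp_activation(x, gain=2.0, v_sat=1.0):
    return v_sat * np.tanh(gain * x / v_sat)

def amp_activation_grad(x, gain=2.0, v_sat=1.0):
    # Derivative of the amplifier model, used during training so the
    # learned weights account for the analog nonlinearity.
    return gain * (1.0 - np.tanh(gain * x / v_sat) ** 2)

rng = np.random.default_rng(0)

# Toy layer: 25 inputs (flattened 5x5 down-sampled image) -> 10 classes.
W = rng.normal(scale=0.1, size=(10, 25))
x = rng.random(25)          # stand-in for one flattened 5x5 image
z = W @ x
y = amp_activation(z)       # hardware-aware forward pass

# One illustrative gradient step: training "sees" the amplifier's
# nonlinearity, so the weights adapt to the analog hardware.
target = np.eye(10)[3]
err = y - target
grad_W = np.outer(err * amp_activation_grad(z), x)
W -= 0.1 * grad_W
```

Because the gradient flows through the amplifier model, the trained weights compensate for its compression and saturation, which is the essence of incorporating the custom activation into the training phase.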

Original language: English (US)
Article number: 9337886
Pages (from-to): 1023-1033
Number of pages: 11
Journal: IEEE Transactions on Circuits and Systems I: Regular Papers
Volume: 68
Issue number: 3
DOIs
State: Published - Mar 2021
Externally published: Yes

Keywords

  • analog neural network
  • custom activation function
  • low resolution image classification
  • Machine learning

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
