Design Space Exploration of Neural Network Activation Function Circuits

Tao Yang, Yadong Wei, Zhijun Tu, Haolun Zeng, Michel A. Kinsy, Nanning Zheng, Pengju Ren

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

The widespread application of artificial neural networks has prompted researchers to experiment with field-programmable gate array (FPGA) and custom ASIC designs to speed up their computation. These implementation efforts have generally focused on weight multiplication and signal summation operations, and less on the activation functions used in these applications. Yet efficient hardware implementations of nonlinear activation functions such as exponential linear units (ELU), scaled ELU (SELU), and hyperbolic tangent (tanh) are central to designing effective neural network accelerators, since these functions are resource-intensive. In this paper, we explore efficient hardware implementations of activation functions using purely combinational circuits, focusing on two widely used nonlinear activation functions: SELU and tanh. Our experiments demonstrate that neural networks are generally insensitive to the precision of the activation function. The results also show that the proposed combinational circuit-based approach is very efficient in terms of speed and area, with negligible accuracy loss on the MNIST, CIFAR-10, and ImageNet benchmarks. Synopsys Design Compiler synthesis results show that the circuit designs for tanh and SELU save 3.13×–7.69× and 4.45×–8.45× in area, respectively, compared to look-up table/memory-based implementations, and can operate at 5.14 GHz and 4.52 GHz using the 28-nm SVT library. The implementation is available at: https://github.com/ThomasMrY/ActivationFunctionDemo.
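The abstract's central empirical claim is that networks tolerate low-precision activation functions. The paper's actual combinational circuits are not reproduced here; as a minimal illustrative sketch (my own, not from the paper), one can quantize tanh outputs to a few fractional bits, as a fixed-point hardware unit would, and observe that the approximation error stays small:

```python
import numpy as np

def quantized_tanh(x, frac_bits=6):
    """tanh with outputs rounded to a 2**-frac_bits fixed-point grid.

    This mimics the output resolution of a low-precision hardware unit;
    the choice of 6 fractional bits is an illustrative assumption, not
    a parameter taken from the paper.
    """
    scale = 2 ** frac_bits
    return np.round(np.tanh(x) * scale) / scale

if __name__ == "__main__":
    x = np.linspace(-4.0, 4.0, 1001)
    max_err = np.max(np.abs(np.tanh(x) - quantized_tanh(x, frac_bits=6)))
    # Rounding to a 2**-6 grid bounds the error by half a step, 2**-7.
    print(f"max abs error with 6 fractional bits: {max_err:.5f}")
```

Errors at this scale are typically dwarfed by other noise sources in training and inference, which is consistent with the abstract's report of negligible accuracy loss on MNIST, CIFAR-10, and ImageNet.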

Original language: English (US)
Article number: 8467987
Pages (from-to): 1974-1978
Number of pages: 5
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Volume: 38
Issue number: 10
DOIs
State: Published - Oct 2019
Externally published: Yes

Keywords

  • Activation functions
  • artificial neural networks (ANNs)
  • exponential linear units (ELUs)
  • hyperbolic tangent (tanh)
  • scaled ELUs (SELUs)

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
  • Electrical and Electronic Engineering
