Low-power neuromorphic speech recognition engine with coarse-grain sparsity

Shihui Yin, Deepak Kadetotad, Bonan Yan, Chang Song, Yiran Chen, Chaitali Chakrabarti, Jae-sun Seo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

In recent years, we have seen a surge of interest in neuromorphic computing and its hardware design for cognitive applications. In this work, we present new neuromorphic architecture, circuit, and device co-designs that enable spike-based classification for a speech recognition task. The proposed neuromorphic speech recognition engine supports a sparsely connected deep spiking network with coarse granularity, leading to a large memory reduction with minimal index information. Simulation results show that the proposed deep spiking neural network accelerator achieves a phoneme error rate (PER) of 20.5% on the TIMIT database and consumes 2.57 mW in 40 nm CMOS for real-time performance. To alleviate the memory bottleneck, the use of non-volatile memory is also evaluated and discussed.
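
The abstract's memory argument (coarse-grain sparsity needs only minimal index information) can be illustrated with a short sketch. The Python snippet below is not from the paper: the layer shape, block size, and keep ratio are hypothetical, and it only compares the index-storage cost of block-wise (coarse-grain) sparsity, one keep bit per block, against element-wise (fine-grain) sparsity, roughly one column index per surviving weight.

    # Minimal sketch (illustrative only, not the paper's design):
    # coarse-grain sparsity stores one index/bit per block of weights,
    # fine-grain sparsity stores one index per surviving weight.
    import numpy as np

    rng = np.random.default_rng(0)
    rows, cols, block = 512, 512, 16     # hypothetical layer and block size
    keep_ratio = 0.25                    # keep 25% of weight blocks

    num_blocks = (rows // block) * (cols // block)
    kept = rng.random(num_blocks) < keep_ratio

    # Coarse-grain: a 1-bit keep mask per block of block*block weights.
    coarse_index_bits = num_blocks
    kept_weights = kept.sum() * block * block

    # Fine-grain: ~one column index per kept weight (CSR-style storage).
    fine_index_bits = kept_weights * int(np.ceil(np.log2(cols)))

    print(f"kept weights:         {kept_weights}")
    print(f"coarse index bits:    {coarse_index_bits}")
    print(f"fine index bits:      {fine_index_bits}")
    print(f"index overhead ratio: {fine_index_bits / coarse_index_bits:.1f}x")

Under these assumed numbers the block-wise mask is hundreds of times smaller than per-weight indices, which is the intuition behind the "minimal index information" claim.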

Original language: English (US)
Title of host publication: 2017 22nd Asia and South Pacific Design Automation Conference, ASP-DAC 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 111-114
Number of pages: 4
ISBN (Electronic): 9781509015580
DOIs
State: Published - Feb 16 2017
Event: 22nd Asia and South Pacific Design Automation Conference, ASP-DAC 2017 - Chiba, Japan
Duration: Jan 16 2017 → Jan 19 2017

Publication series

Name: Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC

Other

Other: 22nd Asia and South Pacific Design Automation Conference, ASP-DAC 2017
Country/Territory: Japan
City: Chiba
Period: 1/16/17 → 1/19/17

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
