Abstract

The large number of parameters in deep neural networks (DNNs) often makes them prohibitive for low-power devices, such as field-programmable gate arrays (FPGA). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each of the parameters - the Fisher Information. Based on the importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important parameters. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method can help reduce the complexity of the network with minimal impact on performance.
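The ranking idea in the abstract can be illustrated with a small sketch. This is not the authors' implementation — just a toy linear model where the diagonal of the Fisher information is approximated empirically as the mean squared per-sample gradient of the log-likelihood, parameters are ranked by that score, and the least important fraction is pruned (set to zero). The 30% pruning ratio is an arbitrary choice for illustration.

```python
# Illustrative sketch (assumptions: toy linear "network", Gaussian likelihood,
# empirical diagonal Fisher approximation; not the paper's exact method).
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_hat = X @ w. Under a Gaussian likelihood, the gradient of the
# log-likelihood w.r.t. w_j for sample i is proportional to residual_i * x_ij.
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.1 * rng.normal(size=200)

w = np.linalg.lstsq(X, y, rcond=None)[0]             # fitted parameters

residual = (y - X @ w)[:, None]                      # shape (200, 1)
per_sample_grad = residual * X                       # per-sample score for each w_j
fisher_diag = np.mean(per_sample_grad ** 2, axis=0)  # empirical Fisher diagonal

# Rank parameters by importance; prune the least important 30%.
order = np.argsort(fisher_diag)                      # ascending: least important first
n_prune = int(0.3 * w.size)
w_pruned = w.copy()
w_pruned[order[:n_prune]] = 0.0

print("importance ranking (most to least):", order[::-1])
print("pruned", n_prune, "of", w.size, "parameters")
```

The same score could drive mixed-precision quantization instead of pruning, e.g. by assigning more bits to the parameters at the top of `order[::-1]`.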

Original language: English (US)
Title of host publication: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2647-2651
Number of pages: 5
Volume: 2016-May
ISBN (Electronic): 9781479999880
DOI: 10.1109/ICASSP.2016.7472157
State: Published - May 18 2016
Event: 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Shanghai, China
Duration: Mar 20 2016 - Mar 25 2016

Other

Other: 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016
Country: China
City: Shanghai
Period: 3/20/16 - 3/25/16


Keywords

  • complexity reduction
  • deep neural networks
  • Fisher Information
  • FPGA
  • pruning
  • quantization

ASJC Scopus subject areas

  • Signal Processing
  • Software
  • Electrical and Electronic Engineering

Cite this

Tu, M., Berisha, V., Woolf, M., Seo, J., & Cao, Y. (2016). Ranking the parameters of deep neural networks using the fisher information. In 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings (Vol. 2016-May, pp. 2647-2651). [7472157] Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/ICASSP.2016.7472157

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution
@inproceedings{272a5f547fe54589bef83d66524c50c5,
title = "Ranking the parameters of deep neural networks using the {F}isher information",
abstract = "The large number of parameters in deep neural networks (DNNs) often makes them prohibitive for low-power devices, such as field-programmable gate arrays (FPGA). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each of the parameters - the Fisher Information. Based on the importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important parameters. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method can help reduce the complexity of the network with minimal impact on performance.",
keywords = "complexity reduction, deep neural networks, Fisher Information, FPGA, pruning, quantization",
author = "Ming Tu and Visar Berisha and Martin Woolf and Jae-sun Seo and Yu Cao",
year = "2016",
month = may,
day = "18",
doi = "10.1109/ICASSP.2016.7472157",
language = "English (US)",
volume = "2016-May",
pages = "2647--2651",
booktitle = "2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",
}

TY  - GEN
T1  - Ranking the parameters of deep neural networks using the Fisher information
AU  - Tu, Ming
AU  - Berisha, Visar
AU  - Woolf, Martin
AU  - Seo, Jae-sun
AU  - Cao, Yu
PY  - 2016/5/18
Y1  - 2016/5/18
AB  - The large number of parameters in deep neural networks (DNNs) often makes them prohibitive for low-power devices, such as field-programmable gate arrays (FPGA). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each of the parameters - the Fisher Information. Based on the importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important parameters. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method can help reduce the complexity of the network with minimal impact on performance.
KW  - complexity reduction
KW  - deep neural networks
KW  - Fisher Information
KW  - FPGA
KW  - pruning
KW  - quantization
UR  - http://www.scopus.com/inward/record.url?scp=84973402931&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84973402931&partnerID=8YFLogxK
U2  - 10.1109/ICASSP.2016.7472157
DO  - 10.1109/ICASSP.2016.7472157
M3  - Conference contribution
AN  - SCOPUS:84973402931
VL  - 2016-May
SP  - 2647
EP  - 2651
BT  - 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings
PB  - Institute of Electrical and Electronics Engineers Inc.
ER  -