TY - GEN
T1 - Ranking the parameters of deep neural networks using the Fisher information
AU - Tu, Ming
AU - Berisha, Visar
AU - Woolf, Martin
AU - Seo, Jae-sun
AU - Cao, Yu
N1 - Funding Information:
This research was supported in part by the Office of Naval Research grant N000141410722 (Berisha) and an ASU-Mayo seed grant.
Publisher Copyright:
© 2016 IEEE.
PY - 2016/5/18
Y1 - 2016/5/18
N2 - The large number of parameters in deep neural networks (DNNs) often makes them prohibitively costly to deploy on low-power devices, such as field-programmable gate arrays (FPGAs). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each parameter: the Fisher information. Based on the importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important parameters. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method can help reduce the complexity of the network with minimal impact on performance.
AB - The large number of parameters in deep neural networks (DNNs) often makes them prohibitively costly to deploy on low-power devices, such as field-programmable gate arrays (FPGAs). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each parameter: the Fisher information. Based on the importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important parameters. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method can help reduce the complexity of the network with minimal impact on performance.
KW - FPGA
KW - Fisher information
KW - complexity reduction
KW - deep neural networks
KW - pruning
KW - quantization
UR - http://www.scopus.com/inward/record.url?scp=84973402931&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84973402931&partnerID=8YFLogxK
U2 - 10.1109/ICASSP.2016.7472157
DO - 10.1109/ICASSP.2016.7472157
M3 - Conference contribution
AN - SCOPUS:84973402931
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 2647
EP - 2651
BT - 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016
Y2 - 20 March 2016 through 25 March 2016
ER -