Abstract
The large number of parameters in deep neural networks (DNNs) often makes them prohibitively expensive for low-power devices, such as field-programmable gate arrays (FPGAs). In this paper, we propose a method to determine the relative importance of all network parameters by measuring the amount of information that the network output carries about each parameter: the Fisher Information. Based on the resulting importance ranking, we design a complexity reduction scheme that discards unimportant parameters and assigns more quantization bits to more important ones. For evaluation, we construct a deep autoencoder and learn a non-linear dimensionality reduction scheme for accelerometer data measuring the gait of individuals with Parkinson's disease. Experimental results confirm that the proposed ranking method helps reduce the complexity of the network with minimal impact on performance.
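The pipeline the abstract describes (rank parameters by Fisher Information, prune the unimportant ones, and give the important ones more quantization bits) can be sketched as follows. This is a minimal illustration on a toy linear model, not the paper's deep autoencoder: the diagonal empirical Fisher is approximated by the average squared per-sample gradient, and the two-tier bit allocation is a hypothetical scheme chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "network": y_hat = W @ x, with trained weights assumed given.
# (Illustrative stand-in; the paper uses a deep autoencoder on gait data.)
n_in, n_out, n_samples = 8, 4, 256
W = rng.normal(size=(n_out, n_in))
X = rng.normal(size=(n_samples, n_in))
Y = X @ W.T + 0.1 * rng.normal(size=(n_samples, n_out))

# Diagonal empirical Fisher: average squared per-sample gradient of the
# (Gaussian) log-likelihood with respect to each weight.
fisher = np.zeros_like(W)
for x, y in zip(X, Y):
    resid = W @ x - y              # per-sample prediction error
    grad = np.outer(resid, x)      # per-sample gradient dL/dW
    fisher += grad ** 2
fisher /= n_samples

# Rank parameters by importance and prune the least important half.
order = np.argsort(fisher, axis=None)        # flat indices, ascending
prune_mask = np.ones(W.size, dtype=bool)
prune_mask[order[: W.size // 2]] = False     # discard bottom 50%
W_pruned = np.where(prune_mask.reshape(W.shape), W, 0.0)

# Assign more quantization bits to more important surviving parameters
# (hypothetical tiers: top quartile gets 8 bits, the rest get 4).
bits = np.full(W.size, 4)
bits[order[-(W.size // 4):]] = 8
print("kept:", prune_mask.sum(), "of", W.size, "weights")
```

In a real DNN the same idea applies per layer: accumulate squared gradients over a held-out batch, sort all weights by their Fisher estimate, and trade off pruning ratio against bit budget.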
Original language | English (US) |
---|---|
Title of host publication | 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 2647-2651 |
Number of pages | 5 |
Volume | 2016-May |
ISBN (Electronic) | 9781479999880 |
DOIs | |
State | Published - May 18 2016 |
Event | 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Shanghai, China Duration: Mar 20 2016 → Mar 25 2016 |
Other
Other | 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 |
---|---|
Country | China |
City | Shanghai |
Period | 3/20/16 → 3/25/16 |
Keywords
- complexity reduction
- deep neural networks
- Fisher Information
- FPGA
- pruning
- quantization
ASJC Scopus subject areas
- Signal Processing
- Software
- Electrical and Electronic Engineering