Mobile

by Juwon You on 2020-04-27 20:05:49

Date: 2020. 05. 01 (Fri) 15:00

Location: EB5. 533

Presenter: Juwon You

Title: Learning both Weights and Connections for Efficient Neural Networks

Author: Song Han, Jeff Pool, John Tran, William J. Dally

          (Stanford Univ, NVIDIA, NVIDIA, Stanford Univ & NVIDIA)

Abstract: Neural networks are both computationally intensive and memory intensive, making them difficult to deploy on embedded systems. Also, conventional networks fix the architecture before training starts; as a result, training cannot improve the architecture. To address these limitations, we describe a method to reduce the storage and computation required by neural networks by an order of magnitude without affecting their accuracy by learning only the important connections. Our method prunes redundant connections using a three-step method. First, we train the network to learn which connections are important. Next, we prune the unimportant connections. Finally, we retrain the network to fine-tune the weights of the remaining connections. On the ImageNet dataset, our method reduced the number of parameters of AlexNet by a factor of 9, from 61 million to 6.7 million, without incurring accuracy loss. Similar experiments with VGG-16 found that the total number of parameters can be reduced by a factor of 13, from 138 million to 10.3 million, again with no loss of accuracy.
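The train-prune-retrain pipeline in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration of magnitude-based pruning, not the authors' code: the helper names (prune_by_magnitude, retrain_step) and the threshold_scale parameter are hypothetical, standing in for the paper's per-layer quality parameter times the standard deviation of the layer's weights.

```python
import torch
import torch.nn as nn

def prune_by_magnitude(model, threshold_scale=0.7):
    """Step 2 of the pipeline: zero out connections whose weight
    magnitude falls below a per-layer threshold (here a hypothetical
    threshold_scale times the layer's weight std). Returns binary
    masks so pruned connections can be held at zero while retraining."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            w = module.weight.data
            threshold = threshold_scale * w.std()
            mask = (w.abs() > threshold).float()
            module.weight.data *= mask  # set unimportant weights to zero
            masks[name] = mask
    return masks

def retrain_step(model, masks, loss, optimizer):
    """Step 3 of the pipeline: one fine-tuning step. After the
    gradient update, re-apply the masks so that connections pruned
    earlier stay at zero."""
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    for name, module in model.named_modules():
        if name in masks:
            module.weight.data *= masks[name]
```

Step 1 (the initial training that reveals which connections matter) is just ordinary training; the masks produced by the pruning step are then re-applied after every fine-tuning update, which is what keeps the sparsity pattern fixed during retraining.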

Paper: NIPS 2015

Article source: //eslab.cnu.ac.kr/en/Mobile/190-Learning-bothWeights-and-Connections-for-Efficient-Neural-Networks.html
