Deep Neural Networks Optimization


Convolutional neural networks have been an active research topic in Computer Vision because they have achieved impressive results in numerous tasks such as object detection and face verification. Recent work has explored architecture design, a key factor in improving the performance of convolutional neural networks. It has been demonstrated that deeper architectures achieve better results. However, they are computationally expensive, have a large number of parameters, and consume considerable memory. To address this problem, recent approaches have proposed pruning methods, which consist of finding and removing unimportant filters from these networks.

The main idea behind pruning neural networks is that they may contain a large number of unimportant and redundant neurons that could be eliminated (i.e., the network exceeds the required capacity). It is therefore possible to reduce the network size while maintaining its original performance. Inspired by this idea, we propose a new approach to efficiently remove filters from deep convolutional neural networks: we use Partial Least Squares (PLS) together with Variable Importance in Projection (VIP) to measure the importance of each filter, and remove the least important ones.

Experimental results show that the proposed method is able to reduce up to 88% of the floating point operations (FLOPs) without penalizing network accuracy. With a negligible drop in accuracy, we can reduce up to 92% of FLOPs. Furthermore, there are cases where the method is able to improve network accuracy. The table below summarizes our main results.


Method          FLOPs↓ (%)   Acc↓ (%)

VGG16 on
Hu et al.          28.29       -0.66
Li et al.          34.00       -0.10
Huang et al.       64.70        1.90
Ours               67.25       -0.63

ResNet56 on
Huang et al.       64.70        1.70
Yu et al.          43.61        0.03
He et al.          50.00        0.90
Ours               48.01        0.34

VGG16 on
Li et al.          20.00       14.60
Wang et al.        20.00        2.00
He et al.          20.00        1.40
Ours               36.03        1.06

FLOPs↓ is the percentage of floating point operations removed; Acc↓ is the drop in accuracy in percentage points (negative values indicate the pruned network is more accurate than the original).


You should cite the following paper if you use this software in your work.
