Zhuo's Blog


Quick read: methods of network compression in 2019

Posted on 2019-06-22 | Edited on 2019-07-07 | In Paper reading

Overview

Let’s quickly go through the new models related to network compression published at CVPR 2019, ICLR 2019, and ICML 2019. Some of these works need to be read and understood more carefully.

CVPR 2019

CVPR tends more toward solving problems in practical applications, while ICLR and ICML are closer to theoretical explanations.

1. Exploiting Kernel Sparsity and Entropy for Interpretable CNN Compression[1]

  • Institutes: Xiamen University, Peng Cheng Laboratory (Shenzhen, China), Beihang University, Huawei Noah’s Ark Lab, University at Buffalo and BestImage of Tencent Technology (Shanghai)

  • Notes

  1. Investigates CNN compression from a novel interpretable perspective, discovering that the importance of feature maps depends on both kernel sparsity and information richness (measured by the proposed Kernel Sparsity and Entropy metric);
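The exact KSE metric in the paper relies on a density-based entropy estimator; the snippet below is only a minimal PyTorch sketch of the underlying idea, scoring each input channel by the L1 mass of the kernels that read from it plus a simple histogram entropy of its activations. The function names and the histogram-based entropy are my own simplifications, not the authors’ code.

```python
import torch

def kernel_sparsity(weight):
    # weight: (out_channels, in_channels, k, k) tensor of a conv layer.
    # Total L1 mass of all kernels reading from each input channel;
    # near-zero mass suggests the corresponding feature map is unimportant.
    return weight.abs().sum(dim=(0, 2, 3))                # (in_channels,)

def feature_map_entropy(feats, bins=32):
    # feats: (N, C, H, W) activations feeding the conv layer above.
    # Histogram entropy per channel as a crude "information richness" proxy
    # (the paper uses a nearest-neighbour density estimate instead).
    c = feats.shape[1]
    flat = feats.permute(1, 0, 2, 3).reshape(c, -1)
    ent = torch.zeros(c)
    for i in range(c):
        p = torch.histc(flat[i], bins=bins)
        p = p / p.sum()
        p = p[p > 0]
        ent[i] = -(p * p.log()).sum()
    return ent                                             # (in_channels,)

def channel_importance(weight, feats, alpha=0.5):
    # Blend (normalised) sparsity and entropy; low scores mark pruning candidates.
    s, e = kernel_sparsity(weight), feature_map_entropy(feats)
    s = (s - s.min()) / (s.max() - s.min() + 1e-8)
    e = (e - e.min()) / (e.max() - e.min() + 1e-8)
    return alpha * s + (1 - alpha) * e
```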
Read more »

Going with small and fast networks (1)

Posted on 2019-06-16 | Edited on 2019-06-18 | In Paper reading

Overview

In this post, we are going to look at the following neural network models: MobileNet v1[1] & v2[2], SqueezeNet[3], ShuffleNet v1[4] & v2[5], and NASNet[6]. We consider the following questions:

  1. What in the world do they look like?

  2. Why are they fast? Why are they small? Which one is better, and why?

  3. Why did the authors design them this way?

So, let’s try to answer these questions step by step.
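Before diving in, here is a minimal PyTorch sketch of a depthwise separable convolution, the building block behind MobileNet’s speed and size: a standard k×k convolution from C_in to C_out channels costs k·k·C_in·C_out multiply-adds per output position, while splitting it into a depthwise plus a 1×1 pointwise convolution costs only k·k·C_in + C_in·C_out. The module below is my own illustrative wrapper (with plain ReLU/BatchNorm), not code from any of the papers.

```python
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise (per-channel) conv followed by a 1x1 pointwise conv,
    the MobileNet v1 building block; roughly k*k times fewer weights
    than a full k x k convolution for large channel counts."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size, stride,
                                   padding=kernel_size // 2,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn1, self.bn2 = nn.BatchNorm2d(in_ch), nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.depthwise(x)))
        return self.relu(self.bn2(self.pointwise(x)))

# Weight count, standard 3x3 conv vs. depthwise separable, 128 -> 256 channels:
std = nn.Conv2d(128, 256, 3, padding=1, bias=False)
sep = DepthwiseSeparableConv(128, 256)
print(sum(p.numel() for p in std.parameters()))  # 294912
print(sum(p.numel() for p in sep.parameters()))  # 34688 (incl. BatchNorm)
```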

Read more »