Filter pruning and re-initialization via latent space clustering

Seunghyun Lee, Byeongho Heo, Jung Woo Ha, Byung Cheol Song

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

Filter pruning is the prevalent approach to pruning-based model compression. Most filter pruning methods suffer from two main issues: 1) the capability of the pruned network depends on that of the source pretrained model, and 2) they do not consider that filter weights follow a normal distribution. To address these issues, we propose a new pruning method that employs both weight re-initialization and latent space clustering. For latent space clustering, we define filters and their feature maps as the vertices and edges of a graph, which is transformed into a latent space by graph convolution; this alleviates the tendency to prune only filters whose weights are near zero. In addition, a portion of the filters is re-initialized under a constraint that enhances filter diversity, so the pruned model is less dependent on the source network. This approach yields more robust accuracy even when pruning from a pretrained model with low accuracy. Extensive experimental results show that our method reduces the FLOPs and parameters of VGG16 by 56.6% and 84.6%, respectively, with negligible loss of accuracy on CIFAR100, which is state-of-the-art performance. Furthermore, our method achieves pruning results that outperform or are comparable to state-of-the-art models on multiple datasets.
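To illustrate the clustering side of the idea, the following is a minimal sketch of selecting representative filters via clustering in an embedding space. It is a hypothetical helper, not the authors' implementation: the paper embeds a filter/feature-map graph with graph convolution, whereas this sketch simply flattens each filter's weights and runs plain k-means, keeping the filter nearest each centroid as the survivor and treating the rest as pruning candidates.

```python
import numpy as np

def select_filters_by_clustering(filters, n_keep, n_iters=10, seed=0):
    """Hypothetical sketch: cluster convolutional filters and keep one
    representative per cluster. `filters` has shape (n_filters, kh, kw)
    (or any trailing shape); returns indices of filters to keep."""
    rng = np.random.default_rng(seed)
    X = filters.reshape(filters.shape[0], -1)  # one flat row per filter
    # initialize centroids from randomly chosen filters
    centers = X[rng.choice(len(X), size=n_keep, replace=False)].copy()
    for _ in range(n_iters):  # plain Lloyd's k-means
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_keep):
            members = X[labels == k]
            if len(members) > 0:
                centers[k] = members.mean(axis=0)
    # keep the filter closest to each centroid; the remaining filters
    # are redundant and become pruning candidates
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    keep = np.unique(dists.argmin(axis=0))
    return keep

# toy usage: 8 random 3x3 filters, keep at most 4 representatives
filters = np.random.default_rng(1).normal(size=(8, 3, 3))
keep = select_filters_by_clustering(filters, n_keep=4)
```

Note that selection by cluster representatives, unlike magnitude-based criteria, can retain a filter whose weights are small but which occupies its own region of the embedding space, which is the intuition behind avoiding "prune only near-zero filters" behavior.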

Original language: English
Pages (from-to): 189587-189597
Number of pages: 11
Journal: IEEE Access
Volume: 8
DOIs
State: Published - 2020

Bibliographical note

Publisher Copyright:
© 2020 Institute of Electrical and Electronics Engineers Inc. All rights reserved.

Keywords

  • Filter pruning
  • Filter re-initialization
  • Latent space clustering
