Three-Dimensional Convolutional Neural Network Pruning with Regularization-Based Method

Yuxin Zhang, Huan Wang, Yang Luo, Lu Yu, Haoji Hu, Hangguan Shan, Tony Q.S. Quek

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Despite their wide application in video analysis, three-dimensional convolutional neural networks (3D CNNs) are constrained by massive computation and storage costs. To address this problem, we propose a three-dimensional regularization-based neural network pruning method that assigns different regularization parameters to different weight groups according to their importance to the network. We further analyze the redundancy and computation cost of each layer to determine layer-specific pruning ratios. Experiments show that pruning with our method achieves a 2× theoretical speedup with only 0.41% accuracy loss for 3D-ResNet18 and 3.28% accuracy loss for C3D. The proposed method performs favorably against other popular methods for model compression and acceleration.
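
The abstract describes group-wise regularization whose strength grows with a group's estimated unimportance. The following is a minimal PyTorch sketch of that general idea, not the paper's exact formulation: the L1-norm importance score, the linear rank-based coefficient schedule, and the name group_regularization are all illustrative assumptions.

```python
import torch
import torch.nn as nn

def group_regularization(model: nn.Module, base_lambda: float = 1e-4) -> torch.Tensor:
    """Filter-wise penalty for 3D conv layers: less important filters
    (by mean absolute weight) get larger coefficients, driving redundant
    filters toward zero so they can later be pruned."""
    penalty = 0.0
    for module in model.modules():
        if isinstance(module, nn.Conv3d):
            w = module.weight  # shape: (out_ch, in_ch, kT, kH, kW)
            # Importance score per output filter: mean absolute weight.
            importance = w.abs().mean(dim=(1, 2, 3, 4))
            # Hypothetical rank-based schedule: the least important filter
            # receives the full base_lambda, the most important receives 0.
            ranks = importance.argsort().argsort().float()
            lambdas = base_lambda * (1.0 - ranks / max(w.shape[0] - 1, 1))
            # Group (filter-wise) squared-L2 penalty, weighted per filter.
            group_l2 = w.pow(2).sum(dim=(1, 2, 3, 4))
            penalty = penalty + (lambdas * group_l2).sum()
    return penalty

# Usage during fine-tuning: add the penalty to the task loss, e.g.
# loss = criterion(model(clips), labels) + group_regularization(model)
```

The layer-specific pruning ratios mentioned in the abstract would then decide, per layer, how many of the smallest-norm filters to remove once the penalty has pushed them near zero.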

Original language: English
Title of host publication: 2019 IEEE International Conference on Image Processing, ICIP 2019 - Proceedings
Publisher: IEEE Computer Society
Pages: 4270-4274
Number of pages: 5
ISBN (Electronic): 9781538662496
DOIs
Publication status: Published - Sep 2019
Event: 26th IEEE International Conference on Image Processing, ICIP 2019 - Taipei, Taiwan
Duration: 22 Sep 2019 - 25 Sep 2019

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2019-September
ISSN (Print): 1522-4880

Conference

Conference: 26th IEEE International Conference on Image Processing, ICIP 2019
Country/Territory: Taiwan
City: Taipei
Period: 22 Sep 2019 - 25 Sep 2019

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing
