Pruning is a model compression method that removes redundant parameters to accelerate inference in deep neural networks while maintaining accuracy. Most existing pruning methods impose conditions directly on parameters or features. In this talk, we introduce a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting features of adjacent hidden layers, which we call the feature flow. We propose feature flow regularization (FFR), which penalizes the length and the total absolute curvature of these trajectories, implicitly increasing the structured sparsity of the parameters. The principle behind FFR is that short and straight trajectories lead to an efficient network free of redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to, or better than, those of state-of-the-art methods.
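As a rough illustration of the idea, the sketch below shows how a length-plus-curvature penalty on the feature flow might look in PyTorch. The function name, the discrete-difference approximations, and the coefficients `lambda_len` and `lambda_curv` are illustrative assumptions, not the formulation presented in the talk.

```python
import torch

def feature_flow_penalty(features, lambda_len=1e-4, lambda_curv=1e-4):
    """Penalize the length and total absolute curvature of the trajectory
    traced by features across adjacent hidden layers.

    features: list of tensors of identical shape [batch, dim],
              one per hidden layer, ordered by depth.
    (Hypothetical sketch; shapes are assumed already aligned.)
    """
    # First differences approximate the "velocity" of the feature flow;
    # summing their norms gives a discrete trajectory length.
    diffs = [b - a for a, b in zip(features[:-1], features[1:])]
    length = sum(d.norm(dim=-1).mean() for d in diffs)

    # Second differences approximate curvature: how much the direction
    # of the flow changes between consecutive layer-to-layer steps.
    curvature = sum((d2 - d1).norm(dim=-1).mean()
                    for d1, d2 in zip(diffs[:-1], diffs[1:]))

    return lambda_len * length + lambda_curv * curvature
```

In a training loop, such a term would simply be added to the task loss, e.g. `loss = task_loss + feature_flow_penalty([h1, h2, h3])`, so that gradient descent shortens and straightens the feature trajectories alongside the usual objective.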

May 4
9:00am - 10:00am
Venue
https://hkust.zoom.us/j/93243000376 (Passcode: hkust)
Speaker/Performer
Miss Yue WU
HKUST
Organizer
Department of Mathematics
Contact Method
Payment Details
Audience
Alumni, Faculty and staff, PG students, UG students
Language
English