Pruning is a model compression method that removes redundant parameters and accelerates the inference of deep neural networks while maintaining accuracy. Most existing pruning methods impose conditions directly on parameters or features. In this talk, we introduce a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting features of adjacent hidden layers, namely the feature flow. We propose feature flow regularization (FFR), which penalizes the length and the total absolute curvature of these trajectories and thereby implicitly increases the structured sparsity of the parameters. The principle behind FFR is that short, straight trajectories lead to an efficient network free of redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to, or even better than, those of state-of-the-art methods.
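The penalty described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the speaker's implementation: it assumes the features of adjacent layers share one dimension, measures trajectory length by first differences, and approximates total absolute curvature by second differences; the weights `lam_len` and `lam_curv` are hypothetical names.

```python
import numpy as np

def feature_flow_penalty(features, lam_len=1.0, lam_curv=1.0):
    """Sketch of a feature-flow regularizer (assumed form, not the talk's exact one).

    `features` is a list of same-shape arrays h_0, ..., h_L, one per hidden
    layer, viewed as points on a trajectory. The penalty combines
      - trajectory length:    sum_l ||h_{l+1} - h_l||
      - curvature proxy:      sum_l ||(h_{l+2} - h_{l+1}) - (h_{l+1} - h_l)||
    A short, straight trajectory minimizes both terms.
    """
    # First differences: the "velocity" of the feature flow between layers.
    diffs = [features[l + 1] - features[l] for l in range(len(features) - 1)]
    length = sum(np.linalg.norm(d) for d in diffs)
    # Second differences: how much the flow direction changes between steps.
    curvature = sum(np.linalg.norm(diffs[l + 1] - diffs[l])
                    for l in range(len(diffs) - 1))
    return lam_len * length + lam_curv * curvature
```

In training, such a penalty would be added to the task loss so that gradients push intermediate features toward short, straight trajectories; a trajectory that doubles back between layers is penalized more heavily than one that moves the same total distance in a straight line.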

4 May 2022
9:00am - 10:00am
Where
https://hkust.zoom.us/j/93243000376 (Passcode: hkust)
Speakers/Performers
Miss Yue WU
HKUST
Organizer(s)
Department of Mathematics
Audience
Alumni, Faculty and staff, PG students, UG students
Language(s)
English