MNN: A universal and efficient inference engine. X Jiang, H Wang, Y Chen, Z Wu, L Wang, B Zou, Y Yang, Z Cui, Y Cai, et al. MLSys'20, 2020. Code: https://github.com/alibaba/MNN. Citations: 156.
Neural pruning via growing regularization. H Wang, C Qin, Y Zhang, Y Fu. ICLR'21, 2021. Code: https://github.com/MingSun-Tse/Regularization-Pruning. Citations: 130.
Collaborative distillation for ultra-resolution universal style transfer. H Wang, Y Li, Y Wang, H Hu, MH Yang. CVPR'20, 2020. Code: https://github.com/MingSun-Tse/Collaborative-Distillation. Citations: 108.
Recent Advances on Neural Network Pruning at Initialization. H Wang, C Qin, Y Bai, Y Zhang, Y Fu. IJCAI'22, 2022. Citations: 95*.
Structured probabilistic pruning for deep convolutional neural network acceleration. H Wang, Q Zhang, Y Wang, H Hu. BMVC'18 (Oral), 2018. Code: https://github.com/MingSun-Tse/Caffe_IncReg. Citations: 84*.
Context reasoning attention network for image super-resolution. Y Zhang, D Wei, C Qin, H Wang, H Pfister, Y Fu. ICCV'21, pp. 4278-4287, 2021. Citations: 73.
Structured Pruning for Efficient ConvNets via Incremental Regularization. H Wang, Q Zhang, Y Wang, L Yu, H Hu. NeurIPS Workshop'18, IJCNN'19 (Oral), 2019. Code: https://github.com/MingSun-Tse …. Citations: 67*.
R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis. H Wang, J Ren, Z Huang, K Olszewski, M Chai, Y Fu, S Tulyakov. ECCV'22, 2022. Project: https://snap-research.github.io/R2L. Citations: 59.
SnapFusion: Text-to-Image Diffusion Model on Mobile Devices within Two Seconds. Y Li*, H Wang*, Q Jin*, J Hu, P Chemerys, Y Fu, Y Wang, S Tulyakov, et al. NeurIPS'23, 2023. Project: https://snap-research.github.io/SnapFusion. Citations: 58.
Contradictory Structure Learning for Semi-supervised Domain Adaptation. C Qin, L Wang, Q Ma, Y Yin, H Wang, Y Fu. SIAM International Conference on Data Mining (SDM), pp. 576-584, 2021. Citations: 55.
Triplet distillation for deep face recognition. Y Feng, H Wang, R Hu, DT Yi. ICML'19 Workshop, 2019. Citations: 51.
Aligned Structured Sparsity Learning for Efficient Image Super-Resolution. H Wang*, Y Zhang*, C Qin, Y Fu. NeurIPS'21 (Spotlight), 2021. Code: https://github.com/MingSun-Tse/ASSL. Citations: 49.
Image as Set of Points. X Ma, Y Zhou, H Wang, C Qin, B Sun, C Liu, Y Fu. ICLR'23 (Oral, top 5%), 2023. Code: https://github.com/ma-xu/Context-Cluster. Citations: 43.
What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective. H Wang, S Lohit, M Jones, Y Fu. NeurIPS'22, 2022. Code: https://github.com/MingSun-Tse/Good-DA-in-KD. Citations: 41*.
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning. H Wang*, Y Zhang*, C Qin, Y Fu. ICLR'22, 2022. Code: https://github.com/MingSun-Tse/SRP. Citations: 41.
Structured pruning for efficient convolutional neural networks via incremental regularization. H Wang, X Hu, Q Zhang, Y Wang, L Yu, H Hu. IEEE Journal of Selected Topics in Signal Processing 14(4), pp. 775-788, 2019. Citations: 34.
Dual Lottery Ticket Hypothesis. Y Bai, H Wang, Z Tao, K Li, Y Fu. ICLR'22, 2022. Code: https://github.com/yueb17/DLTH. Citations: 33.
Real-Time Neural Light Field on Mobile Devices. J Cao, H Wang, P Chemerys, V Shakhrai, J Hu, Y Fu, D Makoviichuk, et al. CVPR'23, 2023. Project: https://snap-research.github.io/MobileR2L/. Citations: 32.
Trainability preserving neural pruning. H Wang, Y Fu. ICLR'23, 2023. Code: https://github.com/MingSun-Tse/TPP. Citations: 28.
Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning. H Wang, C Qin, Y Bai, Y Fu. arXiv preprint arXiv:2301.05219, 2023. Citations: 17.