Accurate neural network pruning requires rethinking sparse optimization

D Kuznedelev, E Kurtic, E Iofinova, E Frantar… - arXiv preprint arXiv …, 2023 - arxiv.org
Obtaining versions of deep neural networks that are both highly-accurate and highly-sparse
is one of the main challenges in the area of model compression, and several high …

RCV2023 Challenges: Benchmarking Model Training and Inference for Resource-Constrained Deep Learning

R Tiwari, A Chavan, D Gupta, G Mago… - Proceedings of the …, 2023 - openaccess.thecvf.com
This paper delves into the results of two resource-constrained deep learning challenges,
part of the workshop on Resource-Efficient Deep Learning for Computer Vision (RCV) at …

Cap: Correlation-aware pruning for highly-accurate sparse vision models

D Kuznedelev, E Kurtić, E Frantar… - Advances in Neural …, 2024 - proceedings.neurips.cc
Driven by significant improvements in architectural design and training pipelines, computer
vision has recently experienced dramatic progress in terms of accuracy on classic …

Effective Neural Network Regularization With BinMask

K Jia, M Rinard - arXiv preprint arXiv:2304.11237, 2023 - arxiv.org
$L_0$ regularization of neural networks is a fundamental problem. In addition to
regularizing models for better generalizability, $L_0$ regularization also applies to …

AdapMTL: Adaptive Pruning Framework for Multitask Learning Model

M Xiang, SJ Tang, Q Yang, H Guan, T Liu - arXiv preprint arXiv …, 2024 - arxiv.org
In the domain of multimedia and multimodal processing, the efficient handling of diverse
data streams such as images, video, and sensor data is paramount. Model compression and …

Edge Detectors Can Make Deep Convolutional Neural Networks More Robust

J Ding, JC Zhao, YZ Sun, P Tan, JW Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Deep convolutional neural networks (DCNNs) are vulnerable to examples with small
perturbations. Improving DCNN robustness is of great significance to the safety-critical …

Feather: An Elegant Solution to Effective DNN Sparsification

AG Georgoulakis, G Retsinas, P Maragos - arXiv preprint arXiv …, 2023 - arxiv.org
Neural Network pruning is an increasingly popular way for producing compact and efficient
models, suitable for resource-limited environments, while preserving high performance …

SequentialAttention++ for Block Sparsification: Differentiable Pruning Meets Combinatorial Optimization

T Yasuda, K Axiotis, G Fu, MH Bateni… - arXiv preprint arXiv …, 2024 - arxiv.org
Neural network pruning is a key technique towards engineering large yet scalable,
interpretable, and generalizable models. Prior work on the subject has developed largely …

Efficiency and generalization of sparse neural networks

EA Peste - 2023 - research-explorer.ista.ac.at
Deep learning has become an integral part of a large number of important applications, and
many of the recent breakthroughs have been enabled by the ability to train very large …

[PDF][PDF] Feather: An Elegant Solution to Effective DNN Sparsification-Supplementary Material

AG Georgoulakis, G Retsinas, P Maragos… - 2023 - bmvc2022.mpi-inf.mpg.de
Table 1 summarizes the training hyperparameters used for our experiments on CIFAR-100
[3] and ImageNet [6] datasets. The chosen hyperparameters are selected based on standard …