G Xu, Q Hu - arXiv preprint arXiv:2201.08542, 2022 - arxiv.org
Model compression techniques are receiving increasing attention; however, the effect of compression on model fairness is still underexplored. This is the first paper to examine the …
With the growth of model and data sizes, a broad effort has been made to design pruning techniques that reduce the resource demand of deep learning pipelines, while retaining …
S Pavlitska, H Grolig, JM Zöllner - 2023 IEEE Symposium …, 2023 - ieeexplore.ieee.org
Increasing the model capacity is a known approach to enhance the adversarial robustness of deep learning networks. On the other hand, various model compression techniques …
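To make the robustness notion in this snippet concrete, below is a minimal sketch of the standard FGSM (fast gradient sign method) perturbation often used in such evaluations; it is a generic illustration, not this paper's protocol, and the epsilon value and the [0, 1] input range are assumptions.

```python
# FGSM robustness-check sketch (standard technique; illustrative only).
import torch
import torch.nn as nn

def fgsm_attack(model, x, y, eps=0.03):
    """Perturb x by eps in the direction that increases the loss (FGSM)."""
    x = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    # Assumes inputs are normalized to [0, 1].
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()
```

Comparing a model's accuracy on clean inputs against its accuracy on `fgsm_attack` outputs is a common way to quantify the robustness that compression may degrade.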
Convolutional Neural Networks (CNNs) have achieved state-of-the-art performance in many computer vision tasks. However, high computational and storage demands hinder …
A Good, J Lin, X Yu, H Sieg… - Advances in …, 2022 - proceedings.neurips.cc
Pruning techniques have been successfully used in neural networks to trade accuracy for sparsity. However, the impact of network pruning is not uniform: prior work has shown that …
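The accuracy-for-sparsity trade described here is most simply realized by magnitude pruning, which zeroes the smallest-magnitude weights. A minimal sketch using PyTorch's built-in `torch.nn.utils.prune` utilities; the model architecture and 50% sparsity level are illustrative placeholders, not taken from the paper.

```python
# Magnitude-pruning sketch (illustrative; not the paper's method).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model.modules():
    if isinstance(module, nn.Linear):
        # L1 unstructured pruning: mask out the 50% smallest-magnitude weights.
        prune.l1_unstructured(module, name="weight", amount=0.5)
        # Make the mask permanent so the zeros persist in `weight`.
        prune.remove(module, "weight")

# Report overall achieved sparsity across all parameters.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.2%}")
```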
With the rapid development of deep learning, the sizes of deep neural networks are growing beyond what hardware platforms can afford. Given the fact that neural networks are …
H Luo, Z Zhuang, Y Li, M Tan, C Chen… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Network pruning has been widely studied to reduce the complexity of deep neural networks (DNNs) and hence speed up their inference. Unfortunately, most existing pruning methods …
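The inference speedups this snippet refers to usually require structured pruning, which removes whole channels rather than scattering zeros that dense hardware cannot exploit. A hedged sketch of L2-norm channel pruning on a convolution, again via PyTorch's `prune` utilities; the layer shape and 30% ratio are assumptions for illustration.

```python
# Structured channel-pruning sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)

# Zero the 30% of output channels with the smallest L2 norm
# (dim=0 of the weight tensor indexes output channels).
prune.ln_structured(conv, name="weight", amount=0.3, n=2, dim=0)
prune.remove(conv, "weight")

# Whole filters are now zero; a real pipeline would next rebuild the layer
# without them to obtain an actual latency reduction.
norms = conv.weight.view(conv.weight.size(0), -1).norm(dim=1)
print(f"zeroed channels: {(norms == 0).sum().item()} / {conv.weight.size(0)}")
```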
Z Jin, Z Zhu, H Hu, M Xue, H Chen - Proceedings of the 2023 ACM Asia …, 2023 - dl.acm.org
Machine learning models have made significant breakthroughs across various domains. However, it is crucial to assess these models to obtain a complete understanding of their …
Neural networks are a powerful class of non-linear functions. However, their black-box nature makes it difficult to explain their behaviour and certify their safety. Abstraction …