MoPE-CLIP: Structured pruning for efficient vision-language models with module-wise pruning error metric

H Lin, H Bai, Z Liu, L Hou, M Sun… - Proceedings of the …, 2024 - openaccess.thecvf.com
Vision-language pre-trained models have achieved impressive performance on various
downstream tasks. However, their large model sizes hinder their utilization on platforms with …

REAF: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning

X Zhang, W Xie, Y Li, K Jiang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Neurologically, filter pruning is a procedure of forgetting and then remembering (recovering).
Prevailing methods directly forget less important information from an unrobust baseline at …

Co-compression via superior gene for remote sensing scene classification

W Xie, X Fan, X Zhang, Y Li, M Sheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Convolutional neural networks (CNNs) have been successfully employed in remote sensing
image classification because of their robust feature representation for different visual tasks …

Interpretability in machine learning: on the interplay with explainability, predictive performances and models

B Leblanc, P Germain - arXiv preprint arXiv:2311.11491, 2023 - arxiv.org
Interpretability has recently gained attention in the field of machine learning, for it is crucial
when it comes to high-stakes decisions or troubleshooting. This abstract concept is hard to …

EquiPocket: an E(3)-equivariant geometric graph neural network for ligand binding site prediction

Y Zhang, Z Wei, Y Yuan, C Li, W Huang - arXiv preprint arXiv:2302.12177, 2023 - arxiv.org
Predicting the binding sites of target proteins plays a fundamental role in drug discovery.
Most existing deep-learning methods consider a protein as a 3D image by spatially …

Filter Pruning for Efficient CNNs via Knowledge-driven Differential Filter Sampler

S Lin, W Huang, J Xie, B Zhang, Y Shen, Z Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Filter pruning simultaneously accelerates the computation and reduces the memory
overhead of CNNs, which can be effectively applied to edge devices and cloud services. In …

SASRNet: Slimming-Assisted Deep Residual Network for Image Steganalysis

S Huang, M Zhang, Y Ke, F Di, Y Kong - Computing and Informatics, 2024 - cai.sk
Existing deep-learning-based image steganalysis networks have problems such as large
model sizes, significant runtime memory usage, and extensive computational operations …

LayerCollapse: Adaptive compression of neural networks

SZ Shabgahi, MS Shariff, F Koushanfar - arXiv preprint arXiv:2311.17943, 2023 - arxiv.org
Handling the ever-increasing scale of contemporary deep learning and transformer-based
models poses a significant challenge. Overparameterized Transformer networks outperform …

Tailored Channel Pruning: Achieve Targeted Model Complexity through Adaptive Sparsity Regularization

S Lee, Y Jeon, S Lee, J Kim - IEEE Access, 2025 - ieeexplore.ieee.org
In deep learning, the size and complexity of neural networks have rapidly increased to
achieve higher performance. However, this poses a challenge when utilized in resource …

Cloud–Edge Collaborative Inference with Network Pruning

M Li, X Zhang, J Guo, F Li - Electronics, 2023 - mdpi.com
With the increase in model parameters, deep neural networks (DNNs) have achieved
remarkable performance in computer vision, but larger DNNs create a bottleneck for …