MIM4DD: Mutual Information Maximization for Dataset Distillation

Y Shang, Z Yuan, Y Yan - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Dataset distillation (DD) aims to synthesize a small dataset whose test performance is
comparable to that of the full dataset using the same model. State-of-the-art (SoTA) methods optimize …

Efficient Multitask Dense Predictor via Binarization

Y Shang, D Xu, G Liu… - Proceedings of the …, 2024 - openaccess.thecvf.com
Multi-task learning for dense prediction has emerged as a pivotal area in computer vision,
enabling the simultaneous processing of diverse yet interrelated pixel-wise prediction tasks …

Contemporary Advances in Neural Network Quantization: A Survey

M Li, Z Huang, L Chen, J Ren, M Jiang… - … Joint Conference on …, 2024 - ieeexplore.ieee.org
In the realm of deep learning, the advent of large-scale pre-trained models has significantly
advanced computer vision and natural language processing. However, deploying these …

Robustness-Guided Image Synthesis for Data-Free Quantization

J Bai, Y Yang, H Chu, H Wang, Z Liu, R Chen… - Proceedings of the …, 2024 - ojs.aaai.org
Quantization has emerged as a promising direction for model compression. Recently, data-
free quantization has been widely studied as a way to avoid privacy concerns …