RPTQ: Reorder-based post-training quantization for large language models. Z Yuan, L Niu, J Liu, W Liu, X Wang, Y Shang, G Sun, Q Wu, J Wu, B Wu. arXiv preprint arXiv:2304.01089, 2023. Cited by 48.
PD-Quant: Post-training quantization based on prediction difference metric. J Liu, L Niu, Z Yuan, D Yang, X Wang, W Liu. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023. Cited by 48.
Improving post-training quantization on object detection with task loss-guided Lp metric. L Niu, J Liu, Z Yuan, D Yang, X Wang, W Liu. arXiv preprint arXiv:2304.09785, 2023. Cited by 3.
Benchmarking the reliability of post-training quantization: a particular focus on worst-case performance. Z Yuan, J Liu, J Wu, D Yang, Q Wu, G Sun, W Liu, X Wang, B Wu. arXiv preprint arXiv:2303.13003, 2023. Cited by 3.
Visual text generation in the wild. Y Zhu, J Liu, F Gao, W Liu, X Wang, P Wang, F Huang, C Yao, Z Yang. arXiv preprint arXiv:2407.14138, 2024.
Stabilized activation scale estimation for precise post-training quantization. Z Hao, X Wang, J Liu, Z Yuan, D Yang, W Liu. Neurocomputing 569, 127120, 2024.