Zhangchi Zhu
Verified email at stu.ecnu.edu.cn
Title
Cited by
Year
Robust Positive-Unlabeled Learning via Noise Negative Sample Self-correction
Z Zhu, L Wang, P Zhao, C Du, W Zhang, H Dong, B Qiao, Q Lin, ...
Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and …, 2023
Cited by 6 · 2023
Fairly Evaluating Large Language Model-based Recommendation Needs Revisit the Cross-Entropy Loss
C Xu, Z Zhu, J Wang, J Wang, W Zhang
arXiv preprint arXiv:2402.06216, 2024
Cited by 4 · 2024
Preference-Consistent Knowledge Distillation for Recommender System
Z Zhu, W Zhang
IEEE Transactions on Knowledge and Data Engineering, 2025
2025
Exploring Feature-based Knowledge Distillation For Recommender System: A Frequency Perspective
Z Zhu, W Zhang
arXiv preprint arXiv:2411.10676, 2024
2024
Are LLM-based Recommenders Already the Best? Simple Scaled Cross-entropy Unleashes the Potential of Traditional Sequential Recommenders
C Xu, Z Zhu, M Yu, J Wang, J Wang, W Zhang
arXiv preprint arXiv:2408.14238, 2024
2024
From Input to Output: A Multi-layer Knowledge Distillation Framework for Compressing Recommendation Models
Z Zhu, W Zhang
arXiv preprint arXiv:2311.04549, 2023
2023
Articles 1–6