Xinyu Zhu
Verified email at mails.tsinghua.edu.cn - Homepage
Title
Cited by
Year
Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence
J Wang, Y Zhang, L Zhang, P Yang, X Gao, Z Wu, X Dong, J He, J Zhuo, ...
arXiv preprint arXiv:2209.02970, 2022
91 · 2022
Solving Math Word Problem via Cooperative Reasoning induced Language Models
X Zhu, J Wang, L Zhang, Y Zhang, R Gan, J Zhang, Y Yang
ACL 2023 Main Conference, 2022
41 · 2022
Zero-Shot Learners for Natural Language Understanding via a Unified Multiple Choice Perspective
P Yang, J Wang, R Gan, X Zhu, L Zhang, Z Wu, X Gao, J Zhang, T Sakai
EMNLP 2022 Main Conference, 2022
16 · 2022
AutoConv: Automatically Generating Information-seeking Conversations with Large Language Models
S Li, C Yang, Y Yin, X Zhu, Z Cheng, L Shang, X Jiang, Q Liu, Y Yang
ACL 2023, 2023
7 · 2023
Question Answering as Programming for Solving Time-Sensitive Questions
X Zhu, C Yang, B Chen, S Li, JG Lou, Y Yang
EMNLP 2023 Main Conference, 2023
4 · 2023
Acronym Extraction with Hybrid Strategies
S Li, C Yang, T Liang, X Zhu, C Yu, Y Yang
AAAI2022 Workshop, 2022
2 · 2022
NER-to-MRC: Named-Entity Recognition Completely Solving as Machine Reading Comprehension
Y Zhang, J Wang, X Zhu, T Sakai, H Yamana
arXiv preprint arXiv:2305.03970, 2023
1 · 2023
Multilingual Acronym Disambiguation with Multi-choice Classification
X Zhu, C Yu, S Li, T Liang, C Yang, Y Yang
SDU@AAAI-22, 2022
1 · 2022
HoLLMwood: Unleashing the Creativity of Large Language Models in Screenwriting via Role Playing
J Chen*, X Zhu*, C Yang, C Shi, Y Xi, Y Zhang, J Wang, J Pu, R Zhang, ...
arXiv preprint arXiv:2406.11683, 2024
· 2024
Unchosen Experts Can Contribute Too: Unleashing MoE Models' Power by Self-Contrast
C Shi*, C Yang*, X Zhu*, J Wang*, T Wu, S Li, D Cai, Y Yang, Y Meng
arXiv preprint arXiv:2405.14507, 2024
· 2024
SSR: Solving Named Entity Recognition Problems via a Single-stream Reasoner
Y Zhang, J Wang, X Zhu, T Sakai, H Yamana
ACM Transactions on Information Systems 42 (5), 1-28, 2024
· 2024
Articles 1–11