Xin Jiang
Noah's Ark Lab, Huawei Technologies
Verified email at huawei.com - Homepage
Title · Cited by · Year
TinyBERT: Distilling BERT for Natural Language Understanding
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
arXiv preprint arXiv:1909.10351, 2019
1715 · 2019
ERNIE: Enhanced Language Representation with Informative Entities
Z Zhang, X Han, Z Liu, X Jiang, M Sun, Q Liu
arXiv preprint arXiv:1905.07129, 2019
1589 · 2019
FILIP: Fine-grained Interactive Language-Image Pre-Training
L Yao, R Huang, L Hou, G Lu, M Niu, H Xu, X Liang, Z Li, X Jiang, C Xu
arXiv preprint arXiv:2111.07783, 2021
452 · 2021
DynaBERT: Dynamic BERT with Adaptive Width and Depth
L Hou, Z Huang, L Shang, X Jiang, X Chen, Q Liu
Advances in Neural Information Processing Systems 33, 2020
273 · 2020
Paraphrase generation with deep reinforcement learning
Z Li, X Jiang, L Shang, H Li
arXiv preprint arXiv:1711.00279, 2017
259 · 2017
Neural generative question answering
J Yin, X Jiang, Z Lu, L Shang, H Li, X Li
arXiv preprint arXiv:1512.01337, 2015
254 · 2015
A Ranking Approach to Keyphrase Extraction
X Jiang, Y Hu, H Li
Proceedings of the 32nd international ACM SIGIR conference on Research and …, 2009
207 · 2009
BinaryBERT: Pushing the Limit of BERT Quantization
H Bai, W Zhang, L Hou, L Shang, J Jin, X Jiang, Q Liu, M Lyu, I King
arXiv preprint arXiv:2012.15701, 2020
194 · 2020
PanGu-α: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation
W Zeng, X Ren, T Su, H Wang, Y Liao, Z Wang, X Jiang, ZZ Yang, K Wang, ...
arXiv preprint arXiv:2104.12369, 2021
191 · 2021
Affective Neural Response Generation
N Asghar, P Poupart, J Hoey, X Jiang, L Mou
European Conference on Information Retrieval, 154-166, 2018
188 · 2018
TernaryBERT: Distillation-aware Ultra-low Bit BERT
W Zhang, L Hou, Y Yin, L Shang, X Chen, X Jiang, Q Liu
arXiv preprint arXiv:2009.12812, 2020
180 · 2020
Aligning Large Language Models with Human: A Survey
Y Wang, W Zhong, L Li, F Mi, X Zeng, W Huang, L Shang, X Jiang, Q Liu
arXiv preprint arXiv:2307.12966, 2023
158 · 2023
Integrating Graph Contextualized Knowledge into Pre-trained Language Models
B He, D Zhou, J Xiao, Q Liu, NJ Yuan, T Xu
arXiv preprint arXiv:1912.00147, 2019
149 · 2019
On Position Embeddings in BERT
B Wang, L Shang, C Lioma, X Jiang, H Yang, Q Liu, JG Simonsen
International Conference on Learning Representations, 2020
133 · 2020
NEZHA: Neural Contextualized Representation for Chinese Language Understanding
J Wei, X Ren, X Li, W Huang, Y Liao, Y Wang, J Lin, X Jiang, X Chen, ...
arXiv preprint arXiv:1909.00204, 2019
121 · 2019
SynCoBERT: Syntax-Guided Multi-Modal Contrastive Pre-Training for Code Representation
X Wang, Y Wang, F Mi, P Zhou, Y Wan, X Liu, L Li, H Wu, J Liu, ...
101 · 2021
Decomposable Neural Paraphrase Generation
Z Li, X Jiang, L Shang, Q Liu
arXiv preprint arXiv:1906.09741, 2019
98 · 2019
Generate & Rank: A Multi-task Framework for Math Word Problems
J Shen, Y Yin, L Li, L Shang, X Jiang, M Zhang, Q Liu
arXiv preprint arXiv:2109.03034, 2021
95 · 2021
Comparison of MISR aerosol optical thickness with AERONET measurements in Beijing metropolitan area
X Jiang, Y Liu, B Yu, M Jiang
Remote Sensing of Environment 107 (1-2), 45-53, 2007
88 · 2007
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation
W Dai, L Hou, L Shang, X Jiang, Q Liu, P Fung
arXiv preprint arXiv:2203.06386, 2022
78 · 2022
Articles 1–20