Noah Constant
Google DeepMind
Verified email at google.com - Homepage
Title | Cited by | Year
The Power of Scale for Parameter-Efficient Prompt Tuning
B Lester, R Al-Rfou, N Constant
arXiv preprint arXiv:2104.08691, 2021
2815 | 2021
Universal Sentence Encoder
D Cer, Y Yang, S Kong, N Hua, N Limtiaco, RS John, N Constant, ...
arXiv preprint arXiv:1803.11175, 2018
2714* | 2018
mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer
L Xue, N Constant, A Roberts, M Kale, R Al-Rfou, A Siddhant, A Barua, ...
arXiv preprint arXiv:2010.11934, 2020
1984 | 2020
Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models
A Srivastava, A Rastogi, A Rao, AAM Shoeb, A Abid, A Fisch, AR Brown, ...
arXiv preprint arXiv:2206.04615, 2022
869 | 2022
Multilingual Universal Sentence Encoder for Semantic Retrieval
Y Yang, D Cer, A Ahmad, M Guo, J Law, N Constant, GH Abrego, S Yuan, ...
arXiv preprint arXiv:1907.04307, 2019
511 | 2019
Character-Level Language Modeling with Deeper Self-Attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI conference on artificial intelligence 33 (01), 3159-3166, 2019
436 | 2019
ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models
L Xue, A Barua, N Constant, R Al-Rfou, S Narang, M Kale, A Roberts, ...
Transactions of the Association for Computational Linguistics 10, 291-306, 2022
345 | 2022
Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models
J Ni, GH Ábrego, N Constant, J Ma, KB Hall, D Cer, Y Yang
arXiv preprint arXiv:2108.08877, 2021
324 | 2021
SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
T Vu, B Lester, N Constant, R Al-Rfou, D Cer
arXiv preprint arXiv:2110.07904, 2021
224 | 2021
Contrastive Topic: Meanings and Realizations
N Constant
209 | 2014
Learning Semantic Textual Similarity from Conversations
Y Yang, S Yuan, D Cer, S Kong, N Constant, P Pilar, H Ge, YH Sung, ...
arXiv preprint arXiv:1804.07754, 2018
179 | 2018
XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
S Ruder, N Constant, J Botha, A Siddhant, O Firat, J Fu, P Liu, J Hu, ...
arXiv preprint arXiv:2104.07412, 2021
130 | 2021
Effective Parallel Corpus Mining using Bilingual Sentence Embeddings
M Guo, Q Shen, Y Yang, H Ge, D Cer, GH Abrego, K Stevens, N Constant, ...
arXiv preprint arXiv:1807.11906, 2018
119 | 2018
English rise-fall-rise: A study in the semantics and pragmatics of intonation
N Constant
Linguistics & Philosophy 35 (5), 407-442, 2012
109 | 2012
FreshLLMs: Refreshing Large Language Models with Search Engine Augmentation
T Vu, M Iyyer, X Wang, N Constant, J Wei, J Wei, C Tar, YH Sung, D Zhou, ...
arXiv preprint arXiv:2310.03214, 2023
68 | 2023
The pragmatics of expressive content: Evidence from large corpora
N Constant, C Davis, C Potts, F Schwarz
Sprache und Datenverarbeitung 33 (1-2), 5-21, 2009
68 | 2009
ReQA: An Evaluation for End-to-End Answer Retrieval Models
A Ahmad, N Constant, Y Yang, D Cer
arXiv preprint arXiv:1907.04780, 2019
55 | 2019
TextSETTR: Few-Shot Text Style Extraction and Tunable Targeted Restyling
P Riley, N Constant, M Guo, G Kumar, D Uthus, Z Parekh
arXiv preprint arXiv:2010.03802, 2020
53* | 2020
LAReQA: Language-agnostic answer retrieval from a multilingual pool
U Roy, N Constant, R Al-Rfou, A Barua, A Phillips, Y Yang
arXiv preprint arXiv:2004.05484, 2020
53 | 2020
Overcoming Catastrophic Forgetting in Zero-Shot Cross-Lingual Generation
T Vu, A Barua, B Lester, D Cer, M Iyyer, N Constant
arXiv preprint arXiv:2205.12647, 2022
47 | 2022
Articles 1–20