The ability to generalize well is one of the primary desiderata for models of natural language processing (NLP), but what 'good generalization' entails and how it should be evaluated is …
Obtaining human-like performance in NLP is often argued to require compositional generalisation. Whether neural networks exhibit this ability is usually studied by training …
Unlike literal expressions, idioms' meanings do not directly follow from their parts, posing a challenge for neural machine translation (NMT). NMT models are often unable to translate …
Y Kim - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
Sequence-to-sequence learning with neural networks has become the de facto standard for sequence modeling. This approach typically models the local distribution over the next …
Despite progress across a broad range of applications, Transformers have limited success in systematic generalization. The situation is especially frustrating in the case of algorithmic …
Training data memorization in NLP can be both beneficial (e.g., closed-book QA) and undesirable (personal data extraction). In any case, successful model training requires a …
NLP models have progressed drastically in recent years, according to the numerous datasets proposed to evaluate their performance. Questions remain, however, about how particular …
Y Yin, Y Li, F Meng, J Zhou, Y Zhang - arXiv preprint arXiv:2210.06709, 2022 - arxiv.org
Modern neural machine translation (NMT) models have achieved competitive performance on standard benchmarks. However, they have recently been shown to suffer limitations in …