T Schick, H Schütze - Proceedings of the AAAI Conference on Artificial …, 2020 - ojs.aaai.org
Pretraining deep neural network architectures with a language modeling objective has brought large improvements for many natural language processing tasks. Exemplified by …
M Garcia, TK Vieira, C Scarton… - Proceedings of the …, 2021 - eprints.whiterose.ac.uk
Contextualised word representation models have been successfully used for capturing different word usages and they may be an attractive alternative for representing idiomaticity …
Despite their success in a variety of NLP tasks, pre-trained language models, due to their heavy reliance on compositionality, fail to effectively capture the meanings of multiword …
D Biś, M Podkorytov, X Liu - … of the 2021 conference of the North …, 2021 - aclanthology.org
The success of language models based on the Transformer architecture appears to be inconsistent with observed anisotropic properties of representations learned by such …
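As a rough illustration of what "anisotropic representations" refers to (my own sketch, not taken from the cited paper), anisotropy is commonly estimated as the mean cosine similarity between embeddings of randomly paired, unrelated tokens; values near 1 indicate the vectors occupy a narrow cone. The function name `mean_pairwise_cosine` is illustrative.

```python
import numpy as np

def mean_pairwise_cosine(embeddings: np.ndarray, n_pairs: int = 10_000,
                         seed: int = 0) -> float:
    """Estimate anisotropy of a (num_tokens, dim) embedding matrix as the
    average cosine similarity over randomly sampled pairs of rows."""
    rng = np.random.default_rng(seed)
    n = embeddings.shape[0]
    i = rng.integers(0, n, size=n_pairs)
    j = rng.integers(0, n, size=n_pairs)
    a = embeddings[i] / np.linalg.norm(embeddings[i], axis=1, keepdims=True)
    b = embeddings[j] / np.linalg.norm(embeddings[j], axis=1, keepdims=True)
    return float(np.mean(np.sum(a * b, axis=1)))

# Random isotropic vectors score near 0; contextual LM embeddings are
# typically reported to score much higher under this kind of measure.
print(mean_pairwise_cosine(np.random.randn(5000, 768)))
```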
State-of-the-art NLP systems represent inputs with word embeddings, but these are brittle when faced with Out-of-Vocabulary (OOV) words. To address this issue, we follow the …
M Garcia, T Kramer Vieira, C Scarton… - Proceedings of ACL …, 2021 - eprints.whiterose.ac.uk
Accurate assessment of the ability of embedding models to capture idiomaticity may require evaluation at token rather than type level, to account for degrees of idiomaticity and possible …
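A minimal sketch of what token-level evaluation of idiomaticity can look like (my own illustration, hedged; the helper names are hypothetical): compare the contextual vector of a multiword-expression occurrence against the average of its constituents' vectors, taking low similarity as a signal of non-compositional use.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def compositionality_score(mwe_in_context: np.ndarray,
                           constituent_vectors: list[np.ndarray]) -> float:
    """Similarity between a token-level MWE vector and the mean of its parts;
    lower scores suggest a more idiomatic (less compositional) occurrence."""
    return cosine(mwe_in_context, np.mean(constituent_vectors, axis=0))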
Z Liang, Y Lu, HG Chen, Y Rao - … of the 61st Annual Meeting of …, 2023 - aclanthology.org
Out-of-vocabulary (OOV) words are difficult to represent yet critical to the performance of embedding-based downstream models. Prior OOV word embedding learning methods …
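One simple, widely used strategy for the OOV problem (a fastText-style sketch of my own, not the method of the cited paper; `ngram_vectors` and `embed_oov` are illustrative names) is to build a vector for an unseen word from the embeddings of its character n-grams.

```python
import numpy as np

def char_ngrams(word: str, n_min: int = 3, n_max: int = 5):
    """Yield character n-grams of a word padded with boundary markers."""
    padded = f"<{word}>"
    for n in range(n_min, n_max + 1):
        for i in range(len(padded) - n + 1):
            yield padded[i:i + n]

def embed_oov(word: str, ngram_vectors: dict[str, np.ndarray],
              dim: int = 300) -> np.ndarray:
    """Average the vectors of known character n-grams; zeros if none match."""
    hits = [ngram_vectors[g] for g in char_ngrams(word) if g in ngram_vectors]
    return np.mean(hits, axis=0) if hits else np.zeros(dim)
```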
We present Variational Bayesian Network (VBN), a novel Bayesian entity representation learning model that utilizes hierarchical and relational side information and is particularly …
Word embeddings are powerful dictionaries that can easily capture language variations. However, these dictionaries fail to give meaning to rare words, which are surprisingly often …