Y Belinkov, J Glass - … of the Association for Computational Linguistics, 2019 - direct.mit.edu
The field of natural language processing has seen impressive progress in recent years, with neural network models replacing many of the traditional systems. A plethora of new models …
This work conducts an extensive evaluation of a large number of word embedding models for language processing applications. First, we introduce popular word …
Cross-lingual representations of words enable us to reason about word meaning in multilingual contexts and are a key facilitator of cross-lingual transfer when developing …
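A common way to obtain such cross-lingual representations is to map two monolingual embedding spaces together with an orthogonal (Procrustes) transformation learned from a seed dictionary; this is standard in the cross-lingual embedding literature, though not necessarily the method of the truncated entry above. A minimal NumPy sketch, with synthetic data standing in for real pretrained vectors:

```python
import numpy as np

def procrustes_align(X, Y):
    """Learn an orthogonal map W minimizing ||X @ W - Y||_F.

    X, Y: (n, d) arrays of source/target embeddings for n
    seed-dictionary pairs. Returns the (d, d) orthogonal W.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy example: recover a hidden rotation between two spaces.
rng = np.random.default_rng(0)
src = rng.normal(size=(1000, 50))
true_W, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # hidden rotation
tgt = src @ true_W
W = procrustes_align(src, tgt)
print(np.allclose(src @ W, tgt, atol=1e-8))  # True: mapping recovered
```

Once W is learned, translating a source word amounts to mapping its vector with W and taking the nearest neighbor in the target space.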
R Bommasani, K Davis, C Cardie - … of the 58th Annual Meeting of …, 2020 - aclanthology.org
Contextualized representations (e.g., ELMo, BERT) have become the default pretrained representations for downstream NLP applications. In some settings, this transition has …
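One simple reduction of the kind studied in this line of work is to mean-pool a word's subword vectors and then average over several contexts to obtain a single static vector. The sketch below uses the Hugging Face transformers library; the model choice and pooling configuration are illustrative, not the paper's specific setup:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def find_span(seq, sub):
    """Index of the first contiguous occurrence of `sub` in `seq`."""
    for i in range(len(seq) - len(sub) + 1):
        if seq[i:i + len(sub)] == sub:
            return i, i + len(sub)
    raise ValueError("word pieces not found in sentence")

def static_vector(word, contexts):
    """Mean-pool a word's subword vectors, then average over contexts."""
    word_ids = tok(word, add_special_tokens=False)["input_ids"]
    vecs = []
    for sent in contexts:
        enc = tok(sent, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
        s, e = find_span(enc["input_ids"][0].tolist(), word_ids)
        vecs.append(hidden[s:e].mean(dim=0))
    return torch.stack(vecs).mean(dim=0)

v = static_vector("bank", ["she sat on the river bank",
                           "the bank approved the loan"])
print(v.shape)  # torch.Size([768]): one static vector for "bank"
```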
Unsupervised machine translation (i.e., not assuming any cross-lingual supervision signal, whether a dictionary, translations, or comparable corpora) seems impossible, but …
Real-valued word representations have transformed NLP applications; popular examples are word2vec and GloVe, recognized for their ability to capture linguistic regularities. In this …
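The "linguistic regularities" these models are known for are typically demonstrated with vector-offset analogies; a minimal sketch using gensim and one of its standard pretrained downloads (the model name is an assumption about which vectors to demonstrate with):

```python
import gensim.downloader as api

# Load pretrained GloVe vectors (downloads ~130 MB on first use).
kv = api.load("glove-wiki-gigaword-100")

# The classic vector-offset regularity: king - man + woman ≈ queen.
print(kv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
# Expected top hit (exact scores vary by vector set): ('queen', ...)
```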
Similarity measures are a vital tool for understanding how language models represent and process language. Standard representational similarity measures such as cosine similarity …
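For reference, the standard cosine measure the snippet names, as a minimal NumPy sketch:

```python
import numpy as np

def cosine_similarity(u, v):
    """cos(u, v) = (u . v) / (||u|| ||v||), in [-1, 1]."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

u = np.array([1.0, 2.0, 3.0])
v = np.array([2.0, 4.0, 6.0])
print(cosine_similarity(u, v))  # 1.0: parallel vectors, maximal similarity
```

Because the measure depends only on angle, not magnitude, it is insensitive to differences in vector norm between representations, which is part of why it is the default choice in this literature.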
A Bakarov - arXiv preprint arXiv:1801.09536, 2018 - arxiv.org
Word embeddings are real-valued word representations, trained on natural language corpora, that are able to capture lexical semantics. Models proposing these representations have …
A Lenci - Annual Review of Linguistics, 2018 - annualreviews.org
Distributional semantics is a usage-based model of meaning, based on the assumption that the statistical distribution of linguistic items in context plays a key role in characterizing their …
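The distributional assumption described here is commonly operationalized by counting context co-occurrences: words that occur in similar contexts receive similar count vectors. A toy sketch (the window size and corpus are illustrative):

```python
from collections import Counter, defaultdict

def cooccurrence_counts(sentences, window=2):
    """Count how often each word pair co-occurs within `window` tokens."""
    counts = defaultdict(Counter)
    for sent in sentences:
        toks = sent.lower().split()
        for i, w in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if j != i:
                    counts[w][toks[j]] += 1
    return counts

corpus = ["the dog chased the cat", "the cat chased the mouse"]
counts = cooccurrence_counts(corpus)
print(counts["dog"])    # Counter({'the': 2, 'chased': 1})
print(counts["mouse"])  # Counter({'the': 1, 'chased': 1})
# "dog" and "mouse" have near-identical context distributions, which is
# exactly the signal distributional models use to judge them similar.
```

Modern embedding models such as word2vec and GloVe can be seen as compressing matrices of this kind into dense low-dimensional vectors.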