Distributional models of word meaning

A Lenci - Annual Review of Linguistics, 2018 - annualreviews.org
Distributional semantics is a usage-based model of meaning, based on the assumption that
the statistical distribution of linguistic items in context plays a key role in characterizing their …

Poincaré embeddings for learning hierarchical representations

M Nickel, D Kiela - Advances in neural information …, 2017 - proceedings.neurips.cc
Representation learning has become an invaluable approach for learning from
symbolic data such as text and graphs. However, state-of-the-art embedding methods …

Poincaré GloVe: Hyperbolic word embeddings

A Tifrea, G Bécigneul, OE Ganea - arXiv preprint arXiv:1810.06546, 2018 - arxiv.org
Words are not created equal. In fact, they form an aristocratic graph with a latent hierarchical
structure that the next generation of unsupervised learned word embeddings should reveal …

Improving hypernymy detection with an integrated path-based and distributional method

V Shwartz, Y Goldberg, I Dagan - arXiv preprint arXiv:1603.06076, 2016 - arxiv.org
Detecting hypernymy relations is a key task in NLP, which is addressed in the literature
using two complementary approaches. Distributional methods, whose supervised variants …

Do supervised distributional methods really learn lexical inference relations?

O Levy, S Remus, C Biemann… - Proceedings of the 2015 …, 2015 - aclanthology.org
Distributional representations of words have been recently used in supervised settings for
recognizing lexical inference relations between word pairs, such as hypernymy and …

Hearst patterns revisited: Automatic hypernym detection from large text corpora

S Roller, D Kiela, M Nickel - arXiv preprint arXiv:1806.03191, 2018 - arxiv.org
Methods for unsupervised hypernym detection may broadly be categorized according to two
paradigms: pattern-based and distributional methods. In this paper, we study the …

On the systematicity of probing contextualized word representations: The case of hypernymy in BERT

A Ravichander, E Hovy, K Suleman… - Proceedings of the …, 2020 - aclanthology.org
Contextualized word representations have become a driving force in NLP, motivating
widespread interest in understanding their capabilities and the mechanisms by which they …

Take and took, gaggle and goose, book and read: Evaluating the utility of vector differences for lexical relation learning

E Vylomova, L Rimell, T Cohn, T Baldwin - arXiv preprint arXiv …, 2015 - arxiv.org
Recent work on word embeddings has shown that simple vector subtraction over pre-trained
embeddings is surprisingly effective at capturing different lexical relations, despite lacking …

SemEval-2016 task 13: Taxonomy extraction evaluation (TExEval-2)

G Bordea, E Lefever, P Buitelaar - Proceedings of the 10th …, 2016 - aclanthology.org
This paper describes the second edition of the shared task on Taxonomy Extraction
Evaluation organised as part of SemEval 2016. This task aims to extract hypernym-hyponym …

Specialising word vectors for lexical entailment

I Vulić, N Mrkšić - arXiv preprint arXiv:1710.06371, 2017 - arxiv.org
We present LEAR (Lexical Entailment Attract-Repel), a novel post-processing method that
transforms any input word vector space to emphasise the asymmetric relation of lexical …