Making sense of word embeddings

M Pelevina, N Arefyev, C Biemann… - arXiv preprint arXiv …, 2017 - arxiv.org
… either directly learn sense representations from corpora or rely on sense inventories from …
sense inventory from existing word embeddings via clustering of ego-networks of related words
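
As a rough illustration of the ego-network idea in this snippet, the sketch below builds a graph over the nearest neighbours of an ambiguous word and reads each graph community as an induced sense. The toy vectors and the greedy-modularity clustering are stand-ins for the paper's word2vec neighbours and Chinese Whispers; all names and parameters are illustrative.

```python
# Illustrative sketch of ego-network sense induction: take the nearest
# neighbours of a target word, connect neighbours that are similar to each
# other, drop the target itself, and treat each community as one sense.
# Toy vectors and modularity clustering are stand-ins for the paper's setup.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def induce_senses(target, vectors, topn=10, edge_threshold=0.5):
    # Rank neighbours of the target by cosine similarity.
    neighbours = sorted(
        (w for w in vectors if w != target),
        key=lambda w: cosine(vectors[target], vectors[w]),
        reverse=True,
    )[:topn]
    # Ego network over the neighbours only (hub removed), with edges
    # between neighbours that are themselves similar.
    graph = nx.Graph()
    graph.add_nodes_from(neighbours)
    for i, u in enumerate(neighbours):
        for v in neighbours[i + 1:]:
            if cosine(vectors[u], vectors[v]) >= edge_threshold:
                graph.add_edge(u, v)
    # Each community of related words is read as one induced sense cluster.
    return [set(c) for c in greedy_modularity_communities(graph)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy embeddings: two artificial "topics" around the ambiguous word.
    base_a, base_b = rng.normal(size=50), rng.normal(size=50)
    vectors = {"bank": base_a + base_b}
    for w in ["river", "shore", "water"]:
        vectors[w] = base_a + 0.1 * rng.normal(size=50)
    for w in ["money", "loan", "credit"]:
        vectors[w] = base_b + 0.1 * rng.normal(size=50)
    print(induce_senses("bank", vectors, topn=6))
```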

From word to sense embeddings: A survey on vector representations of meaning

J Camacho-Collados, MT Pilehvar - Journal of Artificial Intelligence …, 2018 - jair.org
… improvements upon integrating word embeddings, including … properties of words, the
effectiveness of word embeddings is … One representation per word - does it make sense for …

Does BERT make any sense? Interpretable word sense disambiguation with contextualized embeddings

G Wiedemann, S Remus, A Chawla… - arXiv preprint arXiv …, 2019 - arxiv.org
… In this paper, we tested the semantic properties of contextualized word embeddings (CWEs)
to address word sense disambiguation. To test their capabilities to distinguish different …
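
A minimal sketch of the nearest-neighbour flavour of WSD over contextualized embeddings described here: embed the target token in a few sense-labelled example sentences, then give a new occurrence the sense of its closest support example. The model name, the toy sense inventory, and the pooling over subword pieces are assumptions, not the paper's exact setup.

```python
# Nearest-neighbour WSD over contextualized token embeddings (sketch).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_target(sentence, target):
    """Mean of the last-layer vectors of the subword pieces of `target`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]          # (seq_len, dim)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(target_ids) + 1):
        if ids[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"{target!r} not found in sentence")

# Tiny labelled support set (hypothetical examples).
support = [
    ("She deposited the cheque at the bank.", "bank", "bank_FINANCE"),
    ("The bank approved the mortgage.", "bank", "bank_FINANCE"),
    ("They had a picnic on the bank of the river.", "bank", "bank_RIVER"),
    ("The boat drifted toward the muddy bank.", "bank", "bank_RIVER"),
]
support_vecs = [(embed_target(s, w), label) for s, w, label in support]

def disambiguate(sentence, target):
    query = embed_target(sentence, target)
    sims = [(torch.cosine_similarity(query, v, dim=0).item(), label)
            for v, label in support_vecs]
    return max(sims)[1]   # label of the most similar support example

print(disambiguate("Fishermen lined the bank at dawn.", "bank"))
```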

Competing Views of Word Meaning: Word Embeddings and Word Senses

G Grefenstette, P Hanks - International Journal of Lexicography, 2023 - academic.oup.com
… Once created, these word vectors, called word embeddings since each word is embedded
by the learning process in an n-dimensional space, are used as a replacement for the words

How much does a word weigh? Weighting word embeddings for word sense induction

N Arefyev, P Ermolaev, A Panchenko - arXiv preprint arXiv:1805.09209, 2018 - arxiv.org
word embeddings of size 200 trained for 3 epochs on Librusec and containing only … word
embedding models, clustering algorithms and word weighting methods for the context words
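
The snippet compares embedding models, clustering algorithms, and weighting schemes for context words; the sketch below shows the generic recipe under toy assumptions: each occurrence of the target is represented by a weighted average of its context-word embeddings (here an inverse-document-frequency weight), and the occurrence vectors are clustered into senses with agglomerative clustering.

```python
# Weighted-context word sense induction (sketch with toy choices).
from collections import Counter
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def context_vector(tokens, target, vectors, idf):
    # Weighted mean of the context words around `target`.
    acc = np.zeros(next(iter(vectors.values())).shape)
    weights = 0.0
    for tok in tokens:
        if tok == target or tok not in vectors:
            continue
        w = idf.get(tok, 1.0)            # rarer context words weigh more
        acc += w * vectors[tok]
        weights += w
    return acc / max(weights, 1e-9)

def induce(contexts, target, vectors, n_senses=2):
    # Document-frequency based weights computed from the contexts themselves.
    df = Counter(tok for toks in contexts for tok in set(toks))
    idf = {tok: np.log(len(contexts) / df[tok]) + 1.0 for tok in df}
    X = np.stack([context_vector(toks, target, vectors, idf) for toks in contexts])
    return AgglomerativeClustering(n_clusters=n_senses).fit_predict(X)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    base_r, base_m = rng.normal(size=25), rng.normal(size=25)
    vectors = {w: base_r + 0.2 * rng.normal(size=25) for w in ["river", "water", "fish"]}
    vectors.update({w: base_m + 0.2 * rng.normal(size=25) for w in ["money", "loan", "cash"]})
    vectors.update({w: rng.normal(size=25) for w in ["the", "a", "at", "near"]})
    contexts = [
        ["the", "river", "bank", "water"],
        ["fish", "near", "the", "bank", "water"],
        ["the", "bank", "loan", "money"],
        ["cash", "at", "the", "bank", "money"],
    ]
    print(induce(contexts, "bank", vectors))   # e.g. [0 0 1 1]
```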

Multi-sense embeddings through a word sense disambiguation process

T Ruas, W Grosky, A Aizawa - Expert Systems with Applications, 2019 - Elsevier
… -network clustering, and aggregates word vectors with their possible word-sense vectors.
In contrast with these approaches, we use only single vector word embeddings to support our …
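
A hedged sketch of the general "disambiguate first, then embed" pipeline this entry describes: tag each word in the corpus with a sense (NLTK's Lesk is used here purely as a stand-in for the paper's disambiguation process) and train an ordinary word2vec model over the sense-tagged tokens, so each sense receives its own vector. Corpus and hyperparameters are toy choices.

```python
# Multi-sense embeddings via a WSD preprocessing step (sketch).
# Requires: nltk.download("wordnet") (and possibly nltk.download("omw-1.4"))
from nltk.wsd import lesk
from gensim.models import Word2Vec

corpus = [
    ["he", "sat", "on", "the", "bank", "of", "the", "river"],
    ["the", "bank", "approved", "the", "loan"],
    ["she", "opened", "an", "account", "at", "the", "bank"],
]

def sense_tag(tokens):
    tagged = []
    for tok in tokens:
        synset = lesk(tokens, tok)            # None for words not in WordNet
        tagged.append(synset.name() if synset else tok)
    return tagged

sense_corpus = [sense_tag(sent) for sent in corpus]

# Standard word2vec over sense-tagged tokens: one vector per sense tag.
model = Word2Vec(sense_corpus, vector_size=50, window=3, min_count=1, epochs=20)
print(sense_corpus[0])
print(model.wv.most_similar(sense_corpus[0][4], topn=3))
```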

Language modelling makes sense: Propagating representations through WordNet for full-coverage word sense disambiguation

D Loureiro, A Jorge - arXiv preprint arXiv:1906.10007, 2019 - arxiv.org
… We use fastText to generate static word embeddings for the lemmas (v_l … word embeddings
to our previous embeddings. When making predictions, we also compute fastText embeddings
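
The full-coverage idea in this entry can be sketched as propagation over WordNet: senses never seen in annotated data inherit vectors averaged from related, covered synsets. The seed vectors below are random toys, and the fallback order (hypernyms, then lexname, then global average) is a simplification of the paper's synset/hypernym/lexname averaging with concatenated fastText lemma embeddings.

```python
# Propagating sense vectors through WordNet for full coverage (sketch).
# Requires: nltk.download("wordnet")
import numpy as np
from nltk.corpus import wordnet as wn

rng = np.random.default_rng(2)
DIM = 32

# Pretend only a handful of synsets were covered by annotated examples.
seed = {s.name(): rng.normal(size=DIM)
        for s in [wn.synset("dog.n.01"), wn.synset("cat.n.01"),
                  wn.synset("bank.n.01")]}

def lexname_average(lexname):
    vecs = [v for name, v in seed.items() if wn.synset(name).lexname() == lexname]
    return np.mean(vecs, axis=0) if vecs else None

def synset_vector(synset):
    """Return a vector for any synset, propagating when it is uncovered."""
    if synset.name() in seed:
        return seed[synset.name()]
    # 1) average over covered direct hypernyms
    hyper = [seed[h.name()] for h in synset.hypernyms() if h.name() in seed]
    if hyper:
        return np.mean(hyper, axis=0)
    # 2) fall back to the average of covered synsets sharing the lexname
    fallback = lexname_average(synset.lexname())
    if fallback is not None:
        return fallback
    # 3) last resort: global average of all covered synsets
    return np.mean(list(seed.values()), axis=0)

print(synset_vector(wn.synset("puppy.n.01")).shape)   # uncovered, propagated
```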

Best of both worlds: Making word sense embeddings interpretable

A Panchenko - Proceedings of the Tenth International Conference …, 2016 - aclanthology.org
… links word sense embeddings to a lexical resource, making … While most word embedding
approaches represent a term … approaches to produce word sense embeddings from corpora (…
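
A sketch of the linking step this entry alludes to, under loud assumptions: an induced sense is represented by the average vector of its cluster words, each WordNet synset of the target by the average vector of its gloss words, and the induced sense is linked to the closest synset. The random toy vectors only exercise the plumbing; with real pre-trained embeddings in their place the chosen synset becomes meaningful.

```python
# Linking induced senses to a lexical resource for interpretability (sketch).
# Requires: nltk.download("wordnet")
import numpy as np
from nltk.corpus import wordnet as wn

_rng = np.random.default_rng(3)
_cache = {}

def vec(word):
    # Toy random vector per word (stand-in for real pre-trained embeddings).
    if word not in _cache:
        _cache[word] = _rng.normal(size=50)
    return _cache[word]

def average(words):
    return np.mean([vec(w.lower()) for w in words], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def link_sense(target, induced_cluster):
    """Map an induced sense cluster of `target` to its closest WordNet synset."""
    cluster_vec = average(induced_cluster)
    best = max(
        wn.synsets(target),
        key=lambda s: cosine(cluster_vec, average(s.definition().split())),
    )
    return best.name(), best.definition()

print(link_sense("bank", ["river", "shore", "water", "fish"]))
print(link_sense("bank", ["money", "loan", "deposit", "credit"]))
```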

Word sense induction using word embeddings and community detection in complex networks

EA Corrêa Jr, DR Amancio - Physica A: Statistical Mechanics and its …, 2019 - Elsevier
… senses, existing systems are still very limited in the sense that they make use of structured, …
in word embeddings research to generate context embeddings, which are embeddings
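
In the spirit of the context-network approach in this entry, the sketch below turns each occurrence of the ambiguous word into a node, connects occurrences whose averaged context embeddings are similar, and reads graph communities as senses. Toy embeddings and greedy modularity maximisation stand in for the paper's embeddings and community-detection methods.

```python
# Word sense induction via community detection over a context network (sketch).
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def context_embedding(tokens, target, vectors):
    vecs = [vectors[t] for t in tokens if t != target and t in vectors]
    return np.mean(vecs, axis=0)

def induce_senses(contexts, target, vectors, threshold=0.4):
    embs = [context_embedding(toks, target, vectors) for toks in contexts]
    graph = nx.Graph()
    graph.add_nodes_from(range(len(contexts)))
    for i in range(len(embs)):
        for j in range(i + 1, len(embs)):
            sim = float(embs[i] @ embs[j] /
                        (np.linalg.norm(embs[i]) * np.linalg.norm(embs[j])))
            if sim >= threshold:
                graph.add_edge(i, j, weight=sim)
    return [sorted(c) for c in greedy_modularity_communities(graph, weight="weight")]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    base_r, base_m = rng.normal(size=40), rng.normal(size=40)
    vectors = {w: base_r + 0.2 * rng.normal(size=40) for w in ["river", "water", "fish"]}
    vectors.update({w: base_m + 0.2 * rng.normal(size=40) for w in ["money", "loan", "cash"]})
    contexts = [
        ["river", "bank", "water"], ["fish", "bank", "river"],
        ["bank", "loan", "money"], ["cash", "bank", "loan"],
    ]
    print(induce_senses(contexts, "bank", vectors))   # e.g. [[0, 1], [2, 3]]
```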

Embeddings for word sense disambiguation: An evaluation study

IJ Iacobacci, MT Pilehvar, R Navigli - 54th Annual Meeting of the …, 2016 - iris.uniroma1.it
… we study how word embeddings can be used in Word Sense … methods through which word
embeddings can be leveraged … a WSD system that makes use of word embeddings alone, if …
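
One of the integration strategies evaluated in this study weights the embeddings of surrounding words by their distance to the target before handing them to a supervised WSD classifier; the sketch below shows an exponential-decay variant with toy vectors and an illustrative decay parameter.

```python
# Exponential-decay weighting of context-word embeddings as WSD features
# (sketch; the decay form and parameters are illustrative assumptions).
import numpy as np

def exponential_decay_features(tokens, target_index, vectors, window=5, alpha=1.0):
    """Context feature vector: sum of embedding(token) * exp(-alpha * distance)."""
    dim = len(next(iter(vectors.values())))
    feat = np.zeros(dim)
    for i, tok in enumerate(tokens):
        dist = abs(i - target_index)
        if i == target_index or dist > window or tok not in vectors:
            continue
        feat += np.exp(-alpha * dist) * np.asarray(vectors[tok])
    return feat

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    vectors = {w: rng.normal(size=10) for w in ["the", "fisherman", "sat", "on", "river"]}
    tokens = ["the", "fisherman", "sat", "on", "the", "bank"]
    # Target "bank" at index 5; closer words contribute more to the features.
    print(exponential_decay_features(tokens, 5, vectors).round(2))
```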