Breaking through the 80% glass ceiling: Raising the state of the art in word sense disambiguation by incorporating knowledge graph information

M Bevilacqua, R Navigli - Proceedings of the conference …, 2020 - iris.uniroma1.it
Neural architectures are the current state of the art in Word Sense Disambiguation (WSD).
However, they make limited use of the vast amount of relational information encoded in …

ChatGPT: Jack of all trades, master of none

J Kocoń, I Cichecki, O Kaszyca, M Kochanek, D Szydło… - Information …, 2023 - Elsevier
OpenAI has released the Chat Generative Pre-trained Transformer (ChatGPT) and
revolutionized the approach in artificial intelligence to human-model interaction. The first …

Recent trends in word sense disambiguation: A survey

M Bevilacqua, T Pasini… - … Joint Conference on …, 2021 - researchportal.helsinki.fi
Word Sense Disambiguation (WSD) aims at making explicit the semantics of a word
in context by identifying the most suitable meaning from a predefined sense inventory …

An overview of word and sense similarity

R Navigli, F Martelli - Natural Language Engineering, 2019 - cambridge.org
Over the last two decades, determining the similarity between words as well as between
their meanings, that is, word senses, has been proven to be of vital importance in the field of …

A survey on semantic processing techniques

R Mao, K He, X Zhang, G Chen, J Ni, Z Yang… - Information …, 2024 - Elsevier
Semantic processing is a fundamental research domain in computational linguistics. In the
era of powerful pre-trained language models and large language models, the advancement …

SenseBERT: Driving some sense into BERT

Y Levine, B Lenz, O Dagan, O Ram, D Padnos… - arXiv preprint arXiv …, 2019 - arxiv.org
The ability to learn from large unlabeled corpora has allowed neural language models to
advance the frontier in natural language understanding. However, existing self-supervision …

Does BERT make any sense? Interpretable word sense disambiguation with contextualized embeddings

G Wiedemann, S Remus, A Chawla… - arXiv preprint arXiv …, 2019 - arxiv.org
Contextualized word embeddings (CWE) such as provided by ELMo (Peters et al., 2018),
Flair NLP (Akbik et al., 2018), or BERT (Devlin et al., 2019) are a major recent innovation in …

Word sense disambiguation: a unified evaluation framework and empirical comparison

A Raganato, J Camacho-Collados… - Proceedings of the 15th …, 2017 - iris.uniroma1.it
Word Sense Disambiguation is a longstanding task in Natural Language
Processing, lying at the core of human language understanding. However, the evaluation of …

Moving down the long tail of word sense disambiguation with gloss-informed biencoders

T Blevins, L Zettlemoyer - arXiv preprint arXiv:2005.02590, 2020 - arxiv.org
A major obstacle in Word Sense Disambiguation (WSD) is that word senses are not
uniformly distributed, causing existing models to generally perform poorly on senses that are …

With more contexts comes better performance: Contextualized sense embeddings for all-round word sense disambiguation

B Scarlini, T Pasini, R Navigli - Proceedings of the 2020 …, 2020 - iris.uniroma1.it
Contextualized word embeddings have been employed effectively across several tasks in
Natural Language Processing, as they have proved to carry useful semantic information …