D Zmitrovich, A Abramov, A Kalmykov… - arXiv preprint arXiv …, 2023 - arxiv.org
Transformer language models (LMs) are fundamental to NLP research methodologies and applications in various languages. However, developing such models specifically for the …
Semantic relatedness between words is a core concept in natural language processing. While countless approaches have been proposed, determining which one works best is still a …
S Asaadi, S Mohammad… - Proceedings of the 2019 …, 2019 - aclanthology.org
Bigrams (two-word sequences) hold a special place in semantic composition research since they are the smallest unit formed by composing words. A semantic relatedness dataset that …
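To make the composition setting concrete, here is a minimal sketch with hand-crafted toy vectors (everything below is hypothetical, not the paper's data or method): compose each bigram by averaging its two word vectors, then score relatedness between bigrams with cosine similarity.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def compose(vectors, bigram):
    # Simplest composition function: average the two word vectors.
    w1, w2 = bigram.split()
    return (vectors[w1] + vectors[w2]) / 2.0

# Toy 4-dimensional vectors; real experiments would use pretrained embeddings.
vectors = {
    "hot":   np.array([0.9, 0.1, 0.3, 0.0]),
    "dog":   np.array([0.2, 0.8, 0.1, 0.4]),
    "warm":  np.array([0.8, 0.2, 0.3, 0.1]),
    "puppy": np.array([0.3, 0.7, 0.2, 0.4]),
}

score = cosine(compose(vectors, "hot dog"), compose(vectors, "warm puppy"))
print(f"bigram relatedness: {score:.3f}")
```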
The paper describes the results of the first shared task on word sense induction (WSI) for the Russian language. While similar shared tasks were conducted in the past for some …
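For readers unfamiliar with the task, a minimal baseline-style sketch of word sense induction, using hypothetical toy contexts and random vectors (not any participant's system): represent each occurrence of an ambiguous word by the mean of its context word vectors and cluster the occurrences, one cluster per induced sense.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical embedding lookup; real systems use pretrained vectors.
rng = np.random.default_rng(0)
vocab_vectors = {w: rng.normal(size=8) for w in
                 "river bank water money account deposit fishing loan".split()}

# Toy occurrences of the ambiguous word "bank"; same-sense contexts
# share words, so the clusters below recover the two senses.
contexts = [
    "river bank water fishing",   # geographic sense
    "bank account money loan",    # financial sense
    "money deposit bank account",
    "water river fishing bank",
]

def embed(context):
    # Represent an occurrence by the mean of its context word vectors,
    # excluding the target word itself.
    vecs = [vocab_vectors[w] for w in context.split() if w != "bank"]
    return np.mean(vecs, axis=0)

X = np.stack([embed(c) for c in contexts])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for context, label in zip(contexts, labels):
    print(f"induced sense {label}: {context}")
```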
This paper introduces a novel collection of word embeddings, numerical representations of lexical semantics, in 55 languages, trained on a large corpus of pseudo-conversational …
This paper presents a new graph-based approach that induces synsets using synonymy dictionaries and word embeddings. First, we build a weighted graph of synonyms extracted …
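A minimal sketch of the pipeline the snippet describes, with hand-crafted toy data (the pairs, vectors, and threshold are all hypothetical): build a synonymy graph from dictionary pairs, weight edges by embedding similarity, prune weak edges, and read synset candidates off the result. Plain connected components stand in here for the paper's graph clustering.

```python
import numpy as np
import networkx as nx

# Hypothetical dictionary synonymy pairs and hand-crafted toy embeddings.
synonym_pairs = [("car", "auto"), ("auto", "automobile"),
                 ("car", "machine"), ("device", "machine")]
emb = {
    "car":        np.array([0.9, 0.8, 0.1]),
    "auto":       np.array([0.8, 0.9, 0.2]),
    "automobile": np.array([0.9, 0.9, 0.1]),
    "machine":    np.array([0.2, 0.3, 0.9]),
    "device":     np.array([0.1, 0.2, 0.9]),
}

def cos(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Build the synonymy graph, weighting each dictionary edge by the
# embedding similarity of its endpoints.
G = nx.Graph()
for a, b in synonym_pairs:
    G.add_edge(a, b, weight=cos(emb[a], emb[b]))

# Prune weak edges (here the polysemous "car"-"machine" link), then read
# synset candidates off the remaining connected components.
G.remove_edges_from([(a, b) for a, b, w in G.edges(data="weight") if w < 0.7])
for synset in nx.connected_components(G):
    print(sorted(synset))
# -> ['auto', 'automobile', 'car'] and ['device', 'machine']
```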
A traditional view of sentence comprehension holds that the listener parses linguistic input using hierarchical syntactic rules. Recently, physiological evidence for such a claim has …
Y Kim, KM Kim, JM Lee, SK Lee - Proceedings of the 27th …, 2018 - aclanthology.org
Distributed representations of words play a major role in the field of natural language processing by encoding semantic and syntactic information of words. However, most …
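The standard demonstration that such vectors encode semantic and syntactic regularities is the analogy test. A sketch using gensim, assuming a local file in word2vec text format (the path is a placeholder, not a file shipped with the paper):

```python
from gensim.models import KeyedVectors

# Placeholder path; any word2vec-format embedding file works here.
kv = KeyedVectors.load_word2vec_format("embeddings.vec", binary=False)

# Semantic regularity: king - man + woman should land near "queen".
print(kv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# Syntactic regularity: walk -> walked should pattern like go -> went.
print(kv.most_similar(positive=["walked", "go"], negative=["walk"], topn=3))
```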
This paper describes the results of the first shared task on taxonomy enrichment for the Russian language. The participants were asked to extend an existing taxonomy with …
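One common baseline for taxonomy enrichment, sketched below with hand-crafted toy data (not the shared task's reference method): find the new word's nearest neighbours in an embedding space and propose the hypernyms of those neighbours as attachment points in the taxonomy.

```python
import numpy as np

# Hand-crafted toy embeddings and a toy taxonomy (word -> hypernyms);
# real systems would use pretrained vectors and an existing taxonomy.
emb = {
    "poodle": np.array([0.9, 0.8, 0.1]),  # the new, unattached word
    "dog":    np.array([0.8, 0.9, 0.1]),
    "cat":    np.array([0.5, 0.4, 0.6]),
    "mammal": np.array([0.4, 0.5, 0.5]),
}
hypernyms = {"dog": ["mammal"], "cat": ["mammal"], "mammal": ["animal"]}

def cos(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def propose_hypernyms(new_word, k=2):
    # Rank existing taxonomy entries by embedding similarity to the new word...
    neighbours = sorted(hypernyms, key=lambda w: cos(emb[new_word], emb[w]),
                        reverse=True)[:k]
    # ...and pool their hypernyms as candidate attachment points.
    return [h for n in neighbours for h in hypernyms[n]]

print(propose_hypernyms("poodle"))  # -> ['mammal', 'animal']
```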