Experience grounds language

Y Bisk, A Holtzman, J Thomason, J Andreas… - arXiv preprint arXiv …, 2020 - arxiv.org
Language understanding research is held back by a failure to relate language to the
physical world it describes and to the social interactions it facilitates. Despite the incredible …

Knowledge representation learning: A quantitative review

Y Lin, X Han, R Xie, Z Liu, M Sun - arXiv preprint arXiv:1812.10901, 2018 - arxiv.org
Knowledge representation learning (KRL) aims to represent the entities and relations of a knowledge graph in a low-dimensional semantic space, and has been widely used in …
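A minimal sketch of the translational scoring idea this line of work surveys (TransE-style), assuming toy entities, relations, and dimensions that are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy vocabulary of entities and relations (illustrative assumptions only).
entities = {"Paris": 0, "France": 1}
relations = {"capital_of": 0}

E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def transe_score(head, rel, tail):
    """TransE plausibility: a smaller ||h + r - t|| means a more plausible triple."""
    h, r, t = E[entities[head]], R[relations[rel]], E[entities[tail]]
    return np.linalg.norm(h + r - t, ord=1)

print(transe_score("Paris", "capital_of", "France"))
```

In a trained model the embeddings would be optimized so that true triples score lower than corrupted ones; here they are random, so the score is only a demonstration of the computation.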

Hyperbolic entailment cones for learning hierarchical embeddings

O Ganea, G Bécigneul… - International Conference on Machine Learning, 2018 - proceedings.mlr.press
Learning graph representations via low-dimensional embeddings that preserve relevant
network properties is an important class of problems in machine learning. We here present a …
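For orientation, a hedged numpy sketch of the Poincaré-ball distance that underlies such hyperbolic embeddings (the paper's entailment cones add an angular energy on top of this geometry; the points and dimensions below are assumed):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit (Poincare) ball."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / max(denom, eps))

# In hyperbolic hierarchy embeddings, general concepts tend to sit near the
# origin and specific ones near the boundary (toy points, for illustration).
general = np.array([0.1, 0.0])
specific = np.array([0.7, 0.3])
print(poincare_distance(general, specific))
```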

Improving hypernymy detection with an integrated path-based and distributional method

V Shwartz, Y Goldberg, I Dagan - arXiv preprint arXiv:1603.06076, 2016 - arxiv.org
Detecting hypernymy relations is a key task in NLP, which is addressed in the literature
using two complementary approaches. Distributional methods, whose supervised variants …
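As a rough stand-in for the integrated approach (the paper encodes dependency paths with an LSTM; the toy features and off-the-shelf classifier here are assumptions), combining distributional word vectors with path-based features can look like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_pairs, emb_dim, n_path_feats = 200, 50, 20

# Assumed toy data: word vectors for (x, y) plus a bag of dependency-path
# features for corpus paths connecting x and y.
x_vecs = rng.normal(size=(n_pairs, emb_dim))
y_vecs = rng.normal(size=(n_pairs, emb_dim))
path_feats = rng.random(size=(n_pairs, n_path_feats))
labels = rng.integers(0, 2, size=n_pairs)  # 1 = hypernym pair

# Integrated representation: distributional and path-based evidence together.
features = np.hstack([x_vecs, y_vecs, path_feats])
clf = LogisticRegression(max_iter=1000).fit(features, labels)
print(clf.score(features, labels))
```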

Do supervised distributional methods really learn lexical inference relations?

O Levy, S Remus, C Biemann… - Proceedings of the 2015 …, 2015 - aclanthology.org
Distributional representations of words have been recently used in supervised settings for
recognizing lexical inference relations between word pairs, such as hypernymy and …
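The paper's central diagnostic can be sketched as comparing a classifier trained on pair features against one that sees only the candidate hypernym; the random placeholder data below is an assumption, and real experiments use trained embeddings:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, d = 300, 50
x_vecs, y_vecs = rng.normal(size=(n, d)), rng.normal(size=(n, d))
labels = rng.integers(0, 2, size=n)

pair_clf = LogisticRegression(max_iter=1000).fit(np.hstack([x_vecs, y_vecs]), labels)
y_only_clf = LogisticRegression(max_iter=1000).fit(y_vecs, labels)

# If the y-only model comes close to the pair model, the "relation" classifier
# is mostly learning which words are prototypical hypernyms, not a relation.
print(pair_clf.score(np.hstack([x_vecs, y_vecs]), labels),
      y_only_clf.score(y_vecs, labels))
```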

Neural network-based detection of self-admitted technical debt: From performance to explainability

X Ren, Z Xing, X Xia, D Lo, X Wang… - ACM transactions on …, 2019 - dl.acm.org
Technical debt is a metaphor for the tradeoff software engineers make between short-term benefits and long-term stability. Self-admitted technical debt (SATD), a variant of …

Take and took, gaggle and goose, book and read: Evaluating the utility of vector differences for lexical relation learning

E Vylomova, L Rimell, T Cohn, T Baldwin - arXiv preprint arXiv …, 2015 - arxiv.org
Recent work on word embeddings has shown that simple vector subtraction over pre-trained
embeddings is surprisingly effective at capturing different lexical relations, despite lacking …
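A hedged sketch of the vector-offset recipe with gensim; the model path is an assumed placeholder for any word2vec-format file:

```python
from gensim.models import KeyedVectors

# Assumed path to pre-trained vectors; any word2vec-format file works here.
kv = KeyedVectors.load_word2vec_format("GoogleNews-vectors-negative300.bin",
                                       binary=True)

# The offset v(took) - v(take) approximately encodes past tense; adding it
# to v(go) should rank "went" highly among nearest neighbors.
print(kv.most_similar(positive=["took", "go"], negative=["take"], topn=5))

# For relation learning, a pair (x, y) is often represented by the
# difference of its word vectors.
diff = kv["took"] - kv["take"]
```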

Intent detection using semantically enriched word embeddings

JK Kim, G Tur, A Celikyilmaz, B Cao… - 2016 IEEE spoken …, 2016 - ieeexplore.ieee.org
State-of-the-art targeted language understanding systems rely on deep learning methods using one-hot word vectors or off-the-shelf word embeddings. While word embeddings can be …
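A much-simplified sketch of embedding-based intent classification (averaged word vectors plus a linear classifier, not the paper's deep architecture; the toy vocabulary, utterances, and intent labels are all assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed toy embedding table; in practice these would be pre-trained
# (and, per the paper, semantically enriched) word vectors.
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=50) for w in
         "book a flight to boston play some jazz music".split()}

def embed(utterance):
    """Average word vectors into a fixed-size utterance representation."""
    vecs = [vocab[w] for w in utterance.split() if w in vocab]
    return np.mean(vecs, axis=0)

utterances = ["book a flight to boston", "play some jazz music"]
intents = [0, 1]  # 0 = BookFlight, 1 = PlayMusic (illustrative labels)

clf = LogisticRegression().fit([embed(u) for u in utterances], intents)
print(clf.predict([embed("book a flight")]))
```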

Cultural cartography with word embeddings

DS Stoltz, MA Taylor - Poetics, 2021 - Elsevier
Using the frequency of keywords is a classic approach in the formal analysis of text, but has
the drawback of glossing over the relationality of word meanings. Word embedding models …
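A hedged sketch of the "semantic direction" technique common in this literature: average antonym-pair differences into a cultural dimension and project words onto it (the random vectors are placeholders; real analyses use trained embeddings):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=50) for w in
         ["rich", "poor", "affluent", "impoverished", "opera", "wrestling"]}

# A cultural dimension (here: affluence) averaged over antonym pairs.
pairs = [("rich", "poor"), ("affluent", "impoverished")]
direction = np.mean([vocab[a] - vocab[b] for a, b in pairs], axis=0)
direction /= np.linalg.norm(direction)

def project(word):
    """Cosine of the word with the dimension: the sign indicates the pole."""
    v = vocab[word]
    return float(v @ direction / np.linalg.norm(v))

print(project("opera"), project("wrestling"))
```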

SemEval-2018 task 9: Hypernym discovery

J Camacho-Collados, CD Bovi, LE Anke… - Proceedings of the …, 2018 - aclanthology.org
This paper describes the SemEval 2018 Shared Task on Hypernym Discovery. We put
forward this task as a complementary benchmark for modeling hypernymy, a problem which …
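A hedged sketch of one simple baseline for this task setup: project the query term into "hypernym space" and rank vocabulary candidates by cosine similarity (the projection matrix, candidate list, and embeddings below are all assumptions; a real system would learn W from training pairs):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50
candidates = {w: rng.normal(size=dim) for w in ["animal", "bird", "tool", "fruit"]}
query = rng.normal(size=dim)  # embedding of the query term, e.g. "sparrow"

# Assumed linear projection mapping hyponym vectors toward their hypernyms;
# in practice it would be fit on labeled (hyponym, hypernym) pairs.
W = rng.normal(scale=0.1, size=(dim, dim))
projected = W @ query

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

ranked = sorted(candidates, key=lambda w: cos(projected, candidates[w]), reverse=True)
print(ranked)
```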