Semantic memory: A review of methods, models, and current challenges

AA Kumar - Psychonomic Bulletin & Review, 2021 - Springer
Adult semantic memory has been traditionally conceptualized as a relatively static memory
system that consists of knowledge about the world, concepts, and symbols. Considerable …

Multimodal machine learning: A survey and taxonomy

T Baltrušaitis, C Ahuja… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
Our experience of the world is multimodal: we see objects, hear sounds, feel texture, smell
odors, and taste flavors. Modality refers to the way in which something happens or is …

How contextual are contextualized word representations? Comparing the geometry of BERT, ELMo, and GPT-2 embeddings

K Ethayarajh - arXiv preprint arXiv:1909.00512, 2019 - arxiv.org
Replacing static word embeddings with contextualized word representations has yielded
significant improvements on many NLP tasks. However, just how contextual are the …
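One way this question is typically quantified is via self-similarity: the average pairwise cosine similarity between a word's contextualized vectors across different sentences, where 1.0 would mean the representation is effectively static. A minimal plain-Python sketch, using small hand-made vectors as stand-ins for real BERT/ELMo/GPT-2 embeddings:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def self_similarity(contextual_vectors):
    """Average pairwise cosine similarity of one word's vectors across
    contexts: near 1.0 means nearly static, lower means more contextual."""
    n = len(contextual_vectors)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(cosine(contextual_vectors[i], contextual_vectors[j])
               for i, j in pairs) / len(pairs)

# Toy stand-ins for contextualized embeddings of "bank" in three sentences.
bank_vectors = [
    [0.9, 0.1, 0.0],   # "river bank" context
    [0.1, 0.9, 0.2],   # "bank account" context
    [0.2, 0.8, 0.1],   # "bank loan" context
]
print(round(self_similarity(bank_vectors), 3))
```

The two financial-sense vectors are close to each other but far from the river-sense one, so the average lands well below 1.0, mimicking a context-sensitive representation.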

Deep multimodal fusion by channel exchanging

Y Wang, W Huang, F Sun, T Xu… - Advances in neural …, 2020 - proceedings.neurips.cc
Deep multimodal fusion by using multiple sources of data for classification or regression has
exhibited a clear advantage over the unimodal counterpart on various applications. Yet …

Experience grounds language

Y Bisk, A Holtzman, J Thomason, J Andreas… - arXiv preprint arXiv …, 2020 - arxiv.org
Language understanding research is held back by a failure to relate language to the
physical world it describes and to the social interactions it facilitates. Despite the incredible …

Conceptnet 5.5: An open multilingual graph of general knowledge

R Speer, J Chin, C Havasi - Proceedings of the AAAI conference on …, 2017 - ojs.aaai.org
Machine learning about language can be improved by supplying it with specific
knowledge and sources of external information. We present here a new version of the linked …

Evaluating word embedding models: Methods and experimental results

B Wang, A Wang, F Chen, Y Wang… - APSIPA transactions on …, 2019 - cambridge.org
This work conducts an extensive evaluation of a large number of word embedding models
for language processing applications. First, we introduce popular word …

A survey of cross-lingual word embedding models

S Ruder, I Vulić, A Søgaard - Journal of Artificial Intelligence Research, 2019 - jair.org
Cross-lingual representations of words enable us to reason about word meaning in
multilingual contexts and are a key facilitator of cross-lingual transfer when developing …

Vector-space models of semantic representation from a cognitive perspective: A discussion of common misconceptions

F Günther, L Rinaldi, M Marelli - … on Psychological Science, 2019 - journals.sagepub.com
Models that represent meaning as high-dimensional numerical vectors—such as latent
semantic analysis (LSA), hyperspace analogue to language (HAL), bound encoding of the …
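Count-based models of this family, such as HAL, derive a word's vector from how often it co-occurs with every other word inside a sliding window, so words used in similar contexts end up with similar vectors. A toy sketch of that idea (corpus and window size are illustrative choices, not from any of the cited models' actual settings):

```python
from math import sqrt

def cooccurrence_vectors(corpus, window=2):
    """Build HAL-style word vectors: one dimension per vocabulary word,
    counting co-occurrences within `window` positions in each sentence."""
    vocab = sorted({w for sent in corpus for w in sent})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = {w: [0] * len(vocab) for w in vocab}
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if i != j:
                    vectors[w][index[sent[j]]] += 1
    return vectors

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

corpus = [
    "the dog chased the cat".split(),
    "the cat chased the mouse".split(),
    "a dog is a loyal animal".split(),
]
vecs = cooccurrence_vectors(corpus)
# "dog" and "cat" share contexts ("the", "chased"), so their vectors align.
print(round(cosine(vecs["dog"], vecs["cat"]), 3))
```

Real models add weighting (e.g. distance decay in HAL) and dimensionality reduction (e.g. SVD in LSA) on top of these raw counts.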

Using the output embedding to improve language models

O Press, L Wolf - arXiv preprint arXiv:1608.05859, 2016 - arxiv.org
We study the topmost weight matrix of neural network language models. We show that this
matrix constitutes a valid word embedding. When training language models, we recommend …
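The recommendation the snippet leads into is weight tying: reuse the input embedding matrix as the output projection, rather than learning a separate softmax matrix. A minimal plain-Python sketch with a toy random matrix standing in for learned parameters:

```python
import random

# One shared matrix E (VOCAB x DIM): input lookup and output scoring
# both use it, halving the embedding-related parameter count.
random.seed(0)
VOCAB, DIM = 5, 3
E = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(VOCAB)]

def embed(token_id):
    """Input side: look up the token's row of E."""
    return E[token_id]

def output_logits(hidden):
    """Output side: logits = E @ hidden, reusing the same matrix E
    instead of a separate output projection."""
    return [sum(e * h for e, h in zip(row, hidden)) for row in E]

# With tying, if the hidden state equals token 2's embedding, token 2's
# logit is exactly the squared norm of its own embedding row.
logits = output_logits(embed(2))
print(round(logits[2], 3))
```

In a real framework this amounts to sharing one parameter tensor between the embedding layer and the final linear layer, so gradients from both roles update the same weights.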