Dissociating language and thought in large language models

K Mahowald, AA Ivanova, IA Blank, N Kanwisher… - Trends in Cognitive …, 2024 - cell.com
Large language models (LLMs) have come closest among all models to date to mastering
human language, yet opinions about their linguistic and cognitive capabilities remain split …

The language network as a natural kind within the broader landscape of the human brain

E Fedorenko, AA Ivanova, TI Regev - Nature Reviews Neuroscience, 2024 - nature.com
Language behaviour is complex, but neuroscientific evidence disentangles it into
distinct components supported by dedicated brain areas or networks. In this Review, we …

An investigation across 45 languages and 12 language families reveals a universal language network

S Malik-Moraleda, D Ayyash, J Gallée, J Affourtit… - Nature …, 2022 - nature.com
To understand the architecture of human language, it is critical to examine diverse
languages; however, most cognitive neuroscience research has focused on only a handful …

Brains and algorithms partially converge in natural language processing

C Caucheteux, JR King - Communications biology, 2022 - nature.com
Deep learning algorithms trained to predict masked words from large amounts of text have
recently been shown to generate activations similar to those of the human brain. However …

Driving and suppressing the human language network using large language models

G Tuckute, A Sathe, S Srikant, M Taliaferro… - Nature Human …, 2024 - nature.com
Transformer models such as GPT generate human-like language and are predictive of
human brain responses to language. Here, using functional-MRI-measured brain responses …

Probabilistic atlas for the language network based on precision fMRI data from >800 individuals

B Lipkin, G Tuckute, J Affourtit, H Small, Z Mineroff… - Scientific Data, 2022 - nature.com
Two analytic traditions characterize fMRI language research. One relies on averaging
activations across individuals. This approach has limitations: because of inter-individual …

Spatiotemporally distributed frontotemporal networks for sentence reading

O Woolnough, C Donos, E Murphy… - Proceedings of the …, 2023 - National Acad Sciences
Reading a sentence entails integrating the meanings of individual words to infer more
complex, higher-order meaning. This highly rapid and complex human behavior is known to …

Event knowledge in large language models: the gap between the impossible and the unlikely

C Kauf, AA Ivanova, G Rambelli, E Chersoni… - Cognitive …, 2023 - Wiley Online Library
Word co-occurrence patterns in language corpora contain a surprising amount of
conceptual knowledge. Large language models (LLMs), trained to predict words in context …

The domain-general multiple demand (MD) network does not support core aspects of language comprehension: a large-scale fMRI investigation

E Diachek, I Blank, M Siegelman, J Affourtit… - Journal of …, 2020 - Soc Neuroscience
Aside from the language-selective left-lateralized frontotemporal network, language
comprehension sometimes recruits a domain-general bilateral frontoparietal network …

Lexical-semantic content, not syntactic structure, is the main contributor to ANN-brain similarity of fMRI responses in the language network

C Kauf, G Tuckute, R Levy, J Andreas… - Neurobiology of …, 2024 - direct.mit.edu
Representations from artificial neural network (ANN) language models have been
shown to predict human brain activity in the language network. To understand what aspects …