Modern language models refute Chomsky's approach to language

S Piantadosi - Lingbuzz Preprint, lingbuzz, 2023 - lingbuzz.net
The rise and success of large language models undermines virtually every strong claim for
the innateness of language that has been proposed by generative linguistics. Modern …

Large language models demonstrate the potential of statistical learning in language

P Contreras Kallens… - Cognitive …, 2023 - Wiley Online Library
To what degree can language be acquired from linguistic input alone? This question has
vexed scholars for millennia and is still a major focus of debate in the cognitive science of …

Examining the inductive bias of neural language models with artificial languages

JC White, R Cotterell - arXiv preprint arXiv:2106.01044, 2021 - arxiv.org
Since language models are used to model a wide variety of languages, it is natural to ask
whether the neural architectures used for the task have inductive biases towards modeling …

Language model evaluation beyond perplexity

C Meister, R Cotterell - arXiv preprint arXiv:2106.00085, 2021 - arxiv.org
We propose an alternate approach to quantifying how well language models learn natural
language: we ask how well they match the statistical tendencies of natural language. To …

The probabilistic analysis of language acquisition: Theoretical, computational, and experimental analysis

AS Hsu, N Chater, PMB Vitányi - Cognition, 2011 - Elsevier
There is much debate over the degree to which language learning is governed by innate
language-specific biases, or acquired through cognition-general principles. Here we …

What artificial neural networks can tell us about human language acquisition

A Warstadt, SR Bowman - Algebraic structures in natural …, 2022 - taylorfrancis.com
Rapid progress in machine learning for natural language processing has the potential to
transform debates about how humans learn language. However, the learning environments …

Structures, not strings: Linguistics as part of the cognitive sciences

MBH Everaert, MAC Huybregts, N Chomsky… - Trends in cognitive …, 2015 - cell.com
There are many questions one can ask about human language: its distinctive properties,
neural representation, characteristic uses including use in communicative contexts …

Physics of language models: Part 1, context-free grammar

Z Allen-Zhu, Y Li - arXiv preprint arXiv:2305.13673, 2023 - arxiv.org
We design experiments to study how generative language models, like GPT, learn
context-free grammars (CFGs)--diverse language systems with a tree-like structure capturing …

Do large language models understand us?

B Agüera y Arcas - Daedalus, 2022 - direct.mit.edu
Large language models (LLMs) represent a major advance in artificial intelligence and, in
particular, toward the goal of human-like artificial general intelligence. It is sometimes …

Inducing tree-substitution grammars

T Cohn, P Blunsom, S Goldwater - The Journal of Machine Learning …, 2010 - jmlr.org
Inducing a grammar from text has proven to be a notoriously challenging learning task
despite decades of research. The primary reason for its difficulty is that in order to induce …