MJ Traxler - Trends in cognitive sciences, 2014 - cell.com
Syntactic parsing processes establish dependencies between words in a sentence. These dependencies affect how comprehenders assign meaning to sentence constituents …
Modeling human cognition is challenging because there are infinitely many mechanisms that can generate any given observation. Some researchers address this by constraining the …
Today's probabilistic language generators fall short in producing coherent and fluent text, even though the underlying models perform well under standard …
ZG Cai, DA Haslett, X Duan, S Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) and LLM-driven chatbots such as ChatGPT have shown remarkable capacities in comprehending and producing language. However, their internal …
Memory is fleeting. New material rapidly obliterates previous material. How, then, can the brain deal successfully with the continual deluge of linguistic input? We argue that, to deal …
The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of two distinct aspects of word prediction, derived from …
The uniform information density (UID) hypothesis posits a preference among language users for utterances structured such that information is distributed uniformly across a signal …
Language users reduce words in predictable contexts. Previous research indicates that reduction may be stored in lexical representation if a word is often reduced. Because …
Spoken language production involves selecting and assembling words and syntactic structures to convey one's message. Here we probe this process by analyzing natural …