We report a comprehensive review of the published reading studies on retrieval interference in reflexive-/reciprocal-antecedent and subject-verb dependencies. We also provide a …
This paper explores the knowledge of linguistic structure learned by large artificial neural networks, trained via self-supervision, whereby the model simply tries to predict a masked …
V Dentella, F Günther… - Proceedings of the …, 2023 - National Acad Sciences
Humans are universally good at providing stable and accurate judgments about what forms part of their language and what does not. Large Language Models (LMs) are claimed to possess …
Inferences about hypotheses are ubiquitous in the cognitive sciences. Bayes factors provide one general way to compare different hypotheses by their compatibility with the observed …
Modifiers and modification have been a major focus of inquiry for as long as the formal study of semantics has existed, and remain at the heart of major theoretical debates in the field …
J Sprouse - Behavior research methods, 2011 - Springer
Amazon's Mechanical Turk (AMT) is a Web application that provides instant access to thousands of potential participants for survey-based psychology experiments, such as the …
We investigated the relationship between linguistic representation and memory access by comparing the processing of two linguistic dependencies that require comprehenders to …
It is well known in statistics (e.g., Gelman & Carlin, 2014) that treating a result as publishable just because the p-value is less than 0.05 leads to overoptimistic expectations of …
S Lewis, C Phillips - Journal of Psycholinguistic Research, 2015 - Springer
We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models …