Bayesian explanations have swept through cognitive science over the past two decades, from intuitive physics and causal learning, to perception, motor control and language. Yet …
Mental representations remain the central posits of psychology after many decades of scrutiny. However, there is no consensus about the representational format(s) of biological …
People learning new concepts can often generalize successfully from just a single example, yet machine learning algorithms typically require tens or hundreds of examples to perform …
Inductive reasoning is a core problem-solving capacity: humans can identify underlying principles from a few examples, which can then be robustly generalized to novel scenarios …
BM Lake, GL Murphy - Psychological review, 2023 - psycnet.apa.org
Abstract: Machines have achieved a broad and growing set of linguistic competencies, thanks to recent progress in Natural Language Processing (NLP). Psychologists have …
To tackle a hard problem, it is often wise to reuse and recombine existing knowledge. Such an ability to bootstrap enables us to grow rich mental concepts despite limited cognitive …
Much of learning and reasoning occurs in pedagogical situations—situations in which a person who knows a concept chooses examples for the purpose of helping a learner …
Modern generative models exhibit unprecedented capabilities to generate extremely realistic data. However, given the inherent compositionality of the real world, reliable use of …
M Jones, BC Love - Behavioral and brain sciences, 2011 - cambridge.org
The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic …