Authors
Sarah E Anderson, Eric Chiu, Stephanie Huette, Michael J Spivey
Publication date
2011/6/1
Source
Acta Psychologica
Volume
137
Issue
2
Pages
181-189
Publisher
North-Holland
Description
Recent converging evidence suggests that language and vision interact immediately in non-trivial ways, although the exact nature of this interaction is still unclear. Not only does linguistic information influence visual perception in real-time, but visual information also influences language comprehension in real-time. For example, in visual search tasks, incremental spoken delivery of the target features (e.g., “Is there a red vertical?”) can increase the efficiency of conjunction search because only one feature is heard at a time. Moreover, in spoken word recognition tasks, the visual presence of an object whose name is similar to the word being spoken (e.g., a candle present when instructed to “pick up the candy”) can alter the process of comprehension. Dense sampling methods, such as eye-tracking and reach-tracking, richly illustrate the nature of this interaction, providing a semi-continuous measure of the temporal …
Total citations
[Citations-per-year chart, 2011–2024; per-year counts garbled in extraction]