TY - JOUR
T1 - Eye movements and the label feedback effect
T2 - Speaking modulates visual search via template integrity
AU - Hebert, Katherine P.
AU - Goldinger, Stephen D.
AU - Walenchok, Stephen C.
N1 - Funding Information:
Support provided by NIH/NICHD grant R01 HD075800-05. This work was presented to the Society for the Neurobiology of Language in August 2016, and to the Psychonomic Society in November 2016. We thank Gary Lupyan and Kyle Cave for helpful comments, and Gia Veloria and Kayla Block for help in data collection.
Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/5
Y1 - 2021/5
N2 - The label-feedback hypothesis (Lupyan, 2012) proposes that language modulates low- and high-level visual processing, such as priming visual object perception. Lupyan and Swingley (2012) found that repeating target names facilitates visual search, resulting in shorter response times (RTs) and higher accuracy. In the present investigation, we conceptually replicated and extended their study, using additional control conditions and recording eye movements during search. Our goal was to evaluate whether self-directed speech influences target locating (i.e., attentional guidance) or object perception (i.e., distractor rejection and target appreciation). In three experiments, during object search, people spoke target names, nonwords, irrelevant (absent) object names, or irrelevant (present) object names (all within-participants). Experiments 1 and 2 examined search RTs and accuracy: Speaking target names improved performance, without differences among the remaining conditions. Experiment 3 incorporated eye-tracking: Gaze fixation patterns suggested that language does not affect attentional guidance, but instead affects both distractor rejection and target appreciation. When search trials were conditionalized according to distractor fixations, language effects became more orderly: Search was fastest while people spoke target names, followed in linear order by the nonword, distractor-absent, and distractor-present conditions. We suggest that language affects template maintenance during search, allowing fluent differentiation of targets and distractors. Materials, data, and analyses can be retrieved here: https://osf.io/z9ex2/
KW - Label-feedback hypothesis
KW - Language
KW - Perception
KW - Top-down effects
KW - Visual search
UR - http://www.scopus.com/inward/record.url?scp=85099876934&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85099876934&partnerID=8YFLogxK
U2 - 10.1016/j.cognition.2021.104587
DO - 10.1016/j.cognition.2021.104587
M3 - Article
C2 - 33508577
AN - SCOPUS:85099876934
VL - 210
JO - Cognition
JF - Cognition
SN - 0010-0277
M1 - 104587
ER -