-
Colour-concept association formation for novel concepts Visual Cognition (IF 1.875) Pub Date : 2022-07-21 Melissa A. Schoenlein, Karen B. Schloss
ABSTRACT Colour-concept associations influence fundamental processes in cognition and perception, including object recognition and visual reasoning. To understand these effects, it is necessary to understand how colour-concept associations are formed. It is assumed that colour-concept associations are learned through experiences, but questions remain concerning how association formation is influenced
-
Simultaneously and sequentially presented arrays evoke similar visual working memory crowding Visual Cognition (IF 1.875) Pub Date : 2022-07-18 Harun Yörük, Benjamin J. Tamber-Rosenau
ABSTRACT In visual crowding, an item representation is degraded by adjacent flanker items. Recently, the related phenomenon of visual working memory (VWM) crowding has been used to evaluate shared mechanisms between memory and perception. However, some previous studies that investigated VWM crowding suggested that it stemmed from encoding, rather than memory maintenance. In the current study, we evaluated
-
Exploring perceptual similarity and its relation to image-based spaces: an effect of familiarity Visual Cognition (IF 1.875) Pub Date : 2022-06-29 Rosyl S. Somai, Peter J.B. Hancock
ABSTRACT The lack of controlled stimulus transformations is an obstacle to the study of face identity recognition. Researchers are often limited to verbalizable transformations in the creation of a dataset. An alternative approach to verbalization for interpretability is finding image-based measures that allow us to quantify transformations. We explore whether PCA could be used to create controlled
-
On the origin of the Roelofs and induced Roelofs effects Visual Cognition (IF 1.875) Pub Date : 2022-06-27 Wladimir Kirsch
ABSTRACT When asked to align an object to what is perceived as straight ahead under conditions of asymmetrical retinal stimulation, observers systematically err in the direction of the centre of stimulation. This effect is known as the Roelofs (or Dietzel–Roelofs) effect. When asked to judge the position of an object relative to an external reference under related conditions of sensory asymmetry, a systematical
-
How preschoolers perceive danger – A study of inattentional bias Visual Cognition (IF 1.875) Pub Date : 2022-06-16 Feng Na, Zhang Hui, Wang Nan, Yan Congcong, Ji Qianru
ABSTRACT Harmful threats can sometimes appear unexpectedly in the lives of young children, whose limited experience leads to a greater risk of endangerment. The current study adapted the Variant Odd Ball protocol to explore the effects of threat and familiarity on inattentional blindness (IB). This research evaluated reactions to presentations of evolutionarily relevant images such as millipedes, snakes
-
Probing doors to visual awareness: Choice set, visibility, and confidence Visual Cognition (IF 1.875) Pub Date : 2022-06-15 Mario Martinez-Saito
ABSTRACT Visibility and confidence are two subtly different ways (sensation-based and intuition-based, respectively) to assess our visual experiences. We investigated how different choice sets affect the ability of confidence and visibility judgments to retrieve conscious information from perceptual decision-making processes. Six participants made introspective judgments on sinusoidal gratings at close to chance
-
Influence of physical features from peripheral vision on scene categorization in central vision Visual Cognition (IF 1.875) Pub Date : 2022-06-14 Audrey Trouilloud, Pauline Rossel, Cynthia Faurite, Alexia Roux-Sibilon, Louise Kauffmann, Carole Peyrin
ABSTRACT The spatial resolution of the human visual field decreases considerably from the center to the periphery. However, several studies have highlighted the importance of peripheral vision for scene categorization. In Experiment 1, we investigated whether peripheral vision could influence scene categorization in central vision. We used photographs of indoor and outdoor scenes from which we extracted
-
Cross-frequency coupling of frontal theta and posterior alpha is unrelated to the fidelity of visual long-term memory encoding Visual Cognition (IF 1.875) Pub Date : 2022-06-09 Chong Zhao, Keisuke Fukuda, Geoffrey F. Woodman
ABSTRACT Because visual long-term memory relies on multiple spatially distinct brain areas, encoding representations is likely to rely on networks formed via large-scale coupled neuronal oscillations. Decreases in occipital alpha power and increases in mid-frontal theta power appear to individually contribute to the encoding of visual long-term memories. Here we ask whether these oscillations form
-
Chunking by social relationship in working memory Visual Cognition (IF 1.875) Pub Date : 2022-05-06 Ilenia Paparella, Liuba Papeo
ABSTRACT Working memory (WM) uses knowledge and relations to organize and store multiple items in fewer structured units, or chunks. We investigated: (a) whether a crowd that exceeds the WM capacity is retained better if individuals can be grouped in social chunks; and (b) what counts as a social chunk: two individuals involved in a meaningful interaction or just spatially close and face-to-face. In
-
How fixation durations are affected by search difficulty manipulations Visual Cognition (IF 1.875) Pub Date : 2022-04-13 Daniel Ernst, Jeremy M. Wolfe
ABSTRACT Many eye tracking studies of visual search have focused on the role of the number of fixations and the nature of scan paths. Less attention has been paid to fixation durations and to how those durations are affected by stimulus features. Previous studies have shown that fixation durations can be as important as the number of fixations in explaining search times with complex stimuli (e.g.,
-
Subliminal emotional faces do not capture attention under high attentional load in a randomized trial presentation Visual Cognition (IF 1.875) Pub Date : 2022-04-06 Eda Tipura, Alan J. Pegna
ABSTRACT In spatial cueing paradigms, emotional stimuli generally orient attention more efficiently, leading to modulations of behavioural and neuronal responses according to stimulus valence. In a previous study, we showed that when emotional stimuli are not consciously perceived, they cannot orient attention and that emotion is not processed when attentional load is high. In the present studies,
-
Development of attentional bias towards visual word forms in the environment in preschool children Visual Cognition (IF 1.875) Pub Date : 2022-03-31 Tianying Qing, Ying Xiao, Huidong Xue, Wei Wang, Ming Ye, Jing Hu, Licheng Xue, Bing Chen, Yating Lv, Jing Zhao
ABSTRACT Environmental prints (e.g., the name “McDonald’s” on advertising boards) provide a visual environment rich in written words at the early stage of learning to read. Children’s attention to words is closely related to the process of learning to read. However, what remains unclear is how children’s attention to words in environmental prints develops and is related to their reading ability before
-
Are emojis processed visuo-spatially or verbally? Evidence for dual codes Visual Cognition (IF 1.875) Pub Date : 2022-03-30 Lauren A. Homann, Brady R. T. Roberts, Sara Ahmed, Myra A. Fernandes
ABSTRACT While some argue that emojis are processed like words, opponents note dissimilarities. We used a divided attention (DA) technique to examine whether memory for emojis, relative to words, engages primarily verbal or visuo-spatial cognitive representations. We compared the decline in memory output experienced when participants freely recalled a list of studied words or emojis under dual-task
-
Perception of opposite-direction motion in random dot kinematograms Visual Cognition (IF 1.875) Pub Date : 2022-03-14 Gi-Yeul Bae, Steven J. Luck
ABSTRACT Computational models of motion perception suggest that the perceived direction of weak motion signals may sometimes be directly opposite to the true stimulus motion direction. However, this possibility cannot be assessed with standard 2AFC motion discrimination paradigms, because most studies used only two opposite directions of motion (e.g., leftward vs. rightward). We were able to
-
Spatial frequency bands used by patients with glaucoma to recognize facial expressions Visual Cognition (IF 1.875) Pub Date : 2022-03-10 Rémi Mathieu, Esther Hereth, Quentin Lenoble, Jean-François Rouland, Allison M. McKendrick, Muriel Boucart
ABSTRACT The authors investigated the influence of spatial frequencies in foveal vision in glaucomatous patients in a recognition task of facial expressions. Nineteen patients, 16 age-matched controls, and 14 young controls saw centrally presented photographs of faces. Participants categorized the facial expressions as happy, angry or neutral. Two versions were tested: filtered faces of either low (LSF) or high
-
Obligatory integration of face features in expression discrimination Visual Cognition (IF 1.875) Pub Date : 2022-03-10 I. Muukkonen, M. Kilpeläinen, R. Turkkila, T. Saarela, V. Salmela
ABSTRACT Previous composite face studies have shown that an unattended face half that differs in identity or in expression from the attended face half distracts face perception. These studies have typically not controlled for the amount of information in different face halves. We investigated feature integration while participants discriminated angry and happy expressions. The stimuli were scaled using
-
Interactive Cognition: An introduction Visual Cognition (IF 1.875) Pub Date : 2022-03-07 Jelena Ristic, Francesca Capozzi
ABSTRACT Humans do not passively respond to agents and stimuli; we engage with our environments in reciprocal and interactive fashion. This special issue of Visual Cognition presents a collection of twelve original articles which demonstrate that interactive reciprocity changes the expression of basic cognitions and motor actions as well as resulting overt cognitive and social behaviours. The results
-
Correction Visual Cognition (IF 1.875) Pub Date : 2022-03-07
(2022). Correction. Visual Cognition: Vol. 30, Visual Cognition on Interactive Cognition, pp. 85-85.
-
Revisiting the role of visual working memory in attentional control settings Visual Cognition (IF 1.875) Pub Date : 2022-02-28 Lindsay Plater, Blaire Dube, Maria Giammarco, Kirsten Donaldson, Krista Miller, Naseem Al-Aidroos
ABSTRACT Observers adopt attentional control settings (ACSs) based on their goals; stimuli that match the current goal will capture attention, whereas stimuli that do not match the current goal will not. In the present study, we revisited the role of visual working memory (VWM) in maintaining ACSs capable of guiding attentional capture. Participants completed a Posner cueing task while either remembering a colour (Experiments
-
Does motor noise contaminate estimates of the precision of visual working memory? Visual Cognition (IF 1.875) Pub Date : 2022-02-28 David Sutterer, Christina G. Rosca, Geoffrey F. Woodman
ABSTRACT The continuous-report task, in which subjects report the colour of a visual working memory representation by clicking on a colour wheel, has become the gold standard for measuring the precision of representations stored in visual working memory. This task requires fine motor control, typically with a mouse, but the precision of responses has been interpreted as being entirely due to the precision
-
A horizontal–vertical anisotropy in spatial short-term memory Visual Cognition (IF 1.875) Pub Date : 2022-02-24 Daniel T. Smith
ABSTRACT Visual perception and saccadic eye-movements are more precise when directed at isoeccentric locations along the horizontal compared to vertical meridian. This effect is known as horizontal-vertical anisotropy (HVA). Given that the eye-movement system plays an important role in spatial short-term memory (STM) it was hypothesized that spatial STM would also show a horizontal-vertical anisotropy
-
Correction Visual Cognition (IF 1.875) Pub Date : 2022-02-01
(2022). Correction. Visual Cognition: Vol. 30, Visual Cognition on Interactive Cognition, pp. 28-28.
-
Sometimes it helps to be taken out of context: Memory for objects in scenes Visual Cognition (IF 1.875) Pub Date : 2022-01-24 Karla K. Evans, Jeremy M. Wolfe
ABSTRACT It is well known that humans demonstrate massive and surprisingly rich recognition memory for objects and/or scenes and that context typically aids retrieval of episodic memories. However, when we combine picture memory for 100 objects with the context in the form of a background scene, we find that irrelevant contexts lead to substantial impairments of object memory. Twelve experiments used
-
On the relationship between cognitive load and the efficiency of distractor rejection in visual search: The case of motion-form conjunctions Visual Cognition (IF 1.875) Pub Date : 2021-12-27 Kevin Dent
ABSTRACT Search for a target defined by a conjunction of movement and shape (a moving X amongst moving Os and static Xs) is efficient, with static distractors contributing little to RT. How search is restricted to the moving items while static items are ignored is not fully understood. Whether passive bottom-up or active top-down control processes are recruited is unknown. The current study addressed
-
Thematic role tracking difficulties across multiple visual events influences role use in language production Visual Cognition (IF 1.875) Pub Date : 2021-12-27 Andrew Jessop, Franklin Chang
ABSTRACT Language sometimes requires tracking the same participant in different thematic roles across multiple visual events (e.g., The girl that another girl pushed chased a third girl). To better understand how vision and language interact in role tracking, participants described videos of multiple randomly moving circles where two push events were presented. A circle might have the same role in
-
Unmasking the effects of orthography, semantics, and phonology on 2AFC visual word perceptual identification Visual Cognition (IF 1.875) Pub Date : 2021-10-14 Shaylyn Kress, Josh Neudorf, Chelsea Ekstrand, Ron Borowsky
ABSTRACT In the two-alternative forced-choice (2AFC) task, the target stimulus is presented very briefly, and participants must choose which of two options was the presented target. Some past research has assumed that the 2AFC task isolates orthographic effects, despite orthographic, semantic, and phonological differences between the options. If so, performance should not differ between word/nonword
-
Sequential effects in facial attractiveness judgments: Separating perceptual and response biases Visual Cognition (IF 1.875) Pub Date : 2021-11-02 Robin S. S. Kramer, Lyndsay R. Pustelnik
ABSTRACT When items are presented sequentially, the evaluation of the current item is biased by both the previous item’s value (perceptual bias) and the previous response given (response bias). While these biases have been identified in judgements of facial attractiveness, it is unclear whether they produce assimilation and/or contrast effects. Here, two tasks were employed to measure each bias
-
When a stranger becomes a friend: Measuring the neural correlates of real-world face familiarisation Visual Cognition (IF 1.875) Pub Date : 2021-11-24 Alison Campbell, James W. Tanaka
ABSTRACT Humans can readily and effortlessly learn new faces encountered in the social environment. As a face transitions from unfamiliar to familiar, the ability to generalize across different images of the same person increases substantially. Fast periodic visual stimulation and EEG (FPVS-EEG) was used to isolate identity-specific responses that generalize across different images of the same person
-
Learning and recognizing facial identity in variable images: New insights from older adults Visual Cognition (IF 1.875) Pub Date : 2021-11-21 Claire M. Matthews, Catherine J. Mondloch
ABSTRACT Recent research has emphasized the importance of using images that incorporate natural variability in appearance (i.e., ambient images) to assess face learning and recognition. Across five tasks, we provide the first examination of older adults’ face learning and recognition in ambient images. Young and older adults showed comparable performance in three tasks: when recognizing a familiar
-
Motor behaviour mimics the gaze response in establishing joint attention, but is moderated by individual differences in adopting the intentional stance towards a robot avatar Visual Cognition (IF 1.875) Pub Date : 2021-11-03 Cesco Willemse, Abdulaziz Abubshait, Agnieszka Wykowska
ABSTRACT Leading another person’s gaze to establish joint attention facilitates social interaction. Previously, it was found that we look back more quickly at agents who engage in joint attention than at agents who display this behaviour less frequently. This paper serves to fill in two remaining knowledge gaps. Firstly, we examined whether this looking-back behaviour is replicated by a manual response.
-
Why signal suppression cannot resolve the attentional capture debate Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Martin Eimer
ABSTRACT Luck et al. [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949]
-
The progress revisited: How the dispute between stimulus-driven and contingent-capture advocates is hampered by a blindness for change Visual Cognition (IF 1.875) Pub Date : 2021-10-19 Mieke Donk
ABSTRACT Luck, Gaspelin, Folk, Remington, and Theeuwes [2021. Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] argue that the debate regarding attentional capture has changed in such a way that there is now some consensus. However, even though a certain degree of agreement has been reached on the question whether
-
How do competing influences of selection history interact? A commentary on Luck et al. (2021) Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Daniel Pearson, Poppy Watson, Mike E. Le Pelley
ABSTRACT Attention researchers have long debated the role that attentional control settings play in determining selection. In their article, Luck et al. (2021) have identified points of consensus among traditionally opposed models of attentional control, one of which is that prior experience (i.e., selection history) allows attentional control settings to suppress attention to salient distractors by
-
Beyond guidance: It’s time to focus on the consequences of attentional capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Alon Zivony
ABSTRACT Attentional capture is assumed to automatically trigger attentional engagement, which gates working memory access. However, recent studies show that engagement is not a necessary outcome of capture and can be disrupted by manipulations that leave capture intact. In this commentary, I suggest that these findings have important implications for the capture debate. Mainly, they suggest that capture
-
Consensus emerges and biased competition wins: A commentary on Luck et al. (2021) Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Carly J. Leonard
ABSTRACT As the debate about the automaticity of attentional capture has raged on over the years, knowledge of how selection functions has been accumulating in the literature. Luck et al. (2021) highlights emerging areas of consensus, including support for multiple mechanisms that influence attentional guidance. Here, I revisit the biased competition model of [Desimone, R., & Duncan, J. (1995). Neural
-
Passive distractor filtering in visual search Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Bo-Yeong Won
ABSTRACT Luck, Gaspelin, Folk, Remington, & Theeuwes [(in press). Progress Toward Resolving the Attentional Capture Debate. Visual Cognition] propose that proactive attentional mechanisms suppress a salient distractor by top-down inhibitory signals. However, considering the abundance of distracting information surrounding us, an automatic and passive distractor filtering system that requires little
-
A bridge to progress further afield: The promise of a common framework on attentional capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Steven B. Most, Kim M. Curby
ABSTRACT Although physical salience looms large in the attentional capture literature, stimuli can also capture attention via salience deriving from non-physical factors. Such psychological salience can stem, for example, from the emotional resonance of stimuli or their relevance to a person’s expertise. We consider how insights from a recently proposed framework for attentional capture can be used
-
The malleability of attentional capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Han Zhang, Tessa R. Abagis, John Jonides
ABSTRACT We suggest that consideration of trial-by-trial variations, individual differences, and training data will enrich the current framework in Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. [(2020). Progress toward resolving the attentional capture debate. Visual Cognition, 1–21. https://doi.org/10.1080/13506285.2020.1848949]. We consider whether attentional capture is
-
Neural evidence for dynamic within-trial changes in allocation of visual attention Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Tobias Feldmann-Wüstefeld
ABSTRACT In their article “Progress toward resolving the attentional capture debate”, Luck et al. (2021. Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21.) aim at reconciling stimulus-driven, goal-driven and signal suppression accounts of visual attention. At the center of their model is a “control state” that determines activations on a priority map and thus
-
Do we need attentional suppression? Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Dirk Kerzel, Stanislas Huynh Cong, Nicolas Burra
ABSTRACT Gaspelin and Luck describe the signal suppression hypothesis, which proposes that attentional suppression prevents the capture of visual attention by salient distractors. We will discuss several problems with this proposal. On a theoretical level, we will argue that attentional suppression is a dispensable mechanism. Most effects of attentional suppression can be easily explained by reduced
-
The moment-by-moment attentional temperature: How do history effects influence attentional capture? Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Árni Kristjánsson, Árni Gunnar Ásgeirsson
ABSTRACT We agree with [Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21] that selection history will play a role in explaining attentional capture. But their conception of history effects (or priming) lacks a clear explication of how attentional history fulfils the crucial role of
-
Standing out in a small crowd: The role of display size in attracting attention Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Seah Chang, Ernst Niebur, Howard E. Egeth
ABSTRACT Strong evidence supporting the top-down modulation of attention has come from studies in which participants learned to suppress a singleton in a heterogeneous four-item display. These studies have been criticized on the grounds that the displays are so sparse that the singleton is not actually salient. We argue that similar evidence of suppression has been found with substantially larger displays
-
Dividing attentional capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Naseem Al-Aidroos
ABSTRACT Is capture automatic or under our control? In a recent review, Luck and colleagues [Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21.] provide an important milestone for gauging our (attention researchers) progress with this question. While it may sometimes feel like we have
-
Foxes, hedgehogs, and attentional capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Clayton Hickey, Wieske van Zoest
ABSTRACT Isaiah Berlin famously suggested that thinkers can be characterized into two groups, foxes and hedgehogs. Foxes multiply ideas, hedgehogs stretch them. Hedgehoggy thinking, based around core universal principles, has historically dominated capture research and is prominent in Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2021. Progress toward resolving the attentional
-
Attentional capture: An ameliorable side-effect of searching for salient targets Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Heinrich R. Liesefeld, Anna M. Liesefeld, Hermann J. Müller
ABSTRACT This commentary highlights that some of the remaining discrepancies in the attentional-capture debate can be resolved by a simple assumption: observers do not use the priority map when this map is useless to solve the task. Rather, whenever search targets are known to be non-salient, observers resort to a previously postulated alternative search strategy for which (distractor) saliency signals
-
What do we know about suppression of attention capture? Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Eric Ruthruff, Christopher Hauck, Mei-Ching Lien
ABSTRACT Luck et al. [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] proposed singleton suppression as a promising resolution to the attention capture debate. Specifically, salient singletons are assumed to generate an “attend-to-me” signal and therefore represent a threat that triggers suppression. One
-
Unresolved issues in distractor suppression: Proactive and reactive mechanisms, implicit learning, and naturalistic distraction Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Joy J. Geng, Shea E. Duarte
ABSTRACT We acknowledge the empirical and theoretical advancements described within Luck et al. and commend the integration of viewpoints on the debate over attentional capture by salient distractors. Our commentary seeks to build on the conversation by drawing attention to open questions that remain about how proactive and reactive mechanisms might operate, the mechanisms of implicit learning, and
-
Understanding of attentional suppression is incomplete without consideration of motivation and context Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Andrew B. Leber
ABSTRACT Luck, Gaspelin, Folk, Remington, and Theeuwes [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. doi:10.1080/13506285.2020.1848949] provide a valuable status report on our collective understanding of attentional capture, and they succeed at identifying common ground and articulating persisting points of discord. Here, I contribute two points.
-
Within and beyond an integrated framework of attentional capture: A perspective from cognitive-affective neuroscience Visual Cognition (IF 1.875) Pub Date : 2021-09-28 James H. Kryklywy, Maria G. M. Manaligod, Rebecca M. Todd
ABSTRACT The integrative framework proposed by Luck and colleagues [Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] represents major progress in the field of attention research, and remaining areas of disagreement provide an opportunity
-
Invited commentary: Attentional capture and its suppression viewed as skills Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Rebecca Rosa Schmid, Christian Büsel, Ulrich Ansorge
ABSTRACT Attention – here, the selection of visual information – is necessary for the control of skills and procedures. In amendment of the principles discussed by Luck et al. [Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949], we review
-
Voluntary choice tasks increase control settings and reduce capture Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Dion T. Henare, Anna Schubö
ABSTRACT In their article, Luck et al. [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] outline alternative hypotheses regarding the cognitive mechanisms that modulate attention capture. This commentary addresses how control signals can be used to modulate feature gain control prior to saliency computations
-
Attention and distraction in the predictive brain Visual Cognition (IF 1.875) Pub Date : 2021-09-28 Heleen A. Slagter, Dirk van Moorselaar
ABSTRACT Whether it is possible to ignore a physically salient distractor has been a topic of active debate over the past 25 years, with empirical evidence for and against each of the theoretical stances. We put forward that predictive processing may provide a unified theoretical perspective that can account reasonably well for the empirical literature on attentional capture. In this perspective, capture
-
Response to commentaries to Luck et al. (2021). Progress toward resolving the attentional capture debate Visual Cognition (IF 1.875) Pub Date : 2021-10-19 Jan Theeuwes
ABSTRACT In a review paper, Luck et al. [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] discussed multiple perspectives on the attentional capture debate. In response to this review paper, several commentaries were written. Here, I respond to these commentaries and discuss those issues that seem to be the
-
Themes and variations: A response to commentaries on Luck, et al. (2021) Visual Cognition (IF 1.875) Pub Date : 2021-10-19 Roger Remington, Charles L. Folk
ABSTRACT The many commentaries on Luck [Luck, S. J., Gaspelin, N., Folk, C. L., Remington, R. W., & Theeuwes, J. (2020). Progress toward resolving the attentional capture debate. Visual Cognition, 1–21] were thoughtful, insightful and important to future research in this area. In this response, we point out common themes and provide clarification and commentary on issues we find central to our perspective
-
Progress and remaining issues: A response to the commentaries on Luck et al. (2021) Visual Cognition (IF 1.875) Pub Date : 2021-09-24 Nicholas Gaspelin, Steven J. Luck
ABSTRACT Luck et al. [(2021). Progress toward resolving the attentional capture debate. Visual Cognition, 29(1), 1–21. https://doi.org/10.1080/13506285.2020.1848949] reviewed evidence that observers can learn to suppress attentional capture by salient distractors
-
Gender and perceived cooperation modulate visual attention in a joint spatial cueing task Visual Cognition (IF 1.875) Pub Date : 2021-10-06 Miles R.A. Tufft, Matthias S. Gobel
ABSTRACT This research investigated how interactive social contexts shape basic visual attention. It has been shown that social information can modulate inhibition of return effects in joint spatial cueing tasks. We predicted that if perceptions of cooperativeness explain this phenomenon, we would then observe larger inhibition of return effects for more cooperative individuals and in highly cooperative
-
The influence of social and emotional context on the gaze leading orienting effect Visual Cognition (IF 1.875) Pub Date : 2021-10-04 S. Gareth Edwards, Megan Rudrum, Katrina L. McDonough, Andrew P. Bayliss
ABSTRACT We spontaneously orient our attention towards people whose gaze we have led (the “gaze leading” effect). Here, we investigated whether this orienting effect is sensitive to the social and emotional content of the stimuli within the interactions. Experiment 1 replicated the gaze leading effect but found no reliable influence of facial dominance or object valence. Experiment 2, where only object
-
The effect of colour matching on perceptual integration of pictures and frames Visual Cognition (IF 1.875) Pub Date : 2021-08-11 Harrison Adler, Helene Intraub
ABSTRACT A black frame presented around one of 12 pictures during rapid serial visual presentation (RSVP) was frequently misperceived as surrounding the preceding or following picture in the sequence (temporal migration; Intraub, H. (1985). Visual dissociation: An illusory conjunction of pictures and forms. Journal of Experimental Psychology: Human Perception and Performance, 11(4), 431–442
-
Satisfaction-of-Search (SOS) impacts multiple-target searches during proofreading: Evidence from eye movements Visual Cognition (IF 1.875) Pub Date : 2021-09-12 Eliza Barach, Leah Gloskey, Heather Sheridan
ABSTRACT In multiple-target visual searches, subsequent search misses (SSMs; [Cain, M. S., Adamo, S. H., & Mitroff, S. R. (2013). A taxonomy of errors in multiple-target visual search. Visual Cognition, 21(7), 899–921. https://doi.org/10.1080/13506285.2013.843627]) occur when the discovery of one target hinders detection of another target (formerly referred to as Satisfaction of Search [Tuddenham,
-
The role of low spatial frequencies in facial emotion processing: A study on anorthoscopic perception Visual Cognition (IF 1.875) Pub Date : 2021-08-24 Vincenza Tommasi, Giulia Prete, Luca Tommasi
ABSTRACT We examined the interaction among emotions, spatial frequencies (SF) and holistic analysis using an anorthoscopic paradigm, which is supposed to alter the holistic processing. Emotional and neutral faces were presented sliding behind a narrow aperture (anorthoscopy), or statically in the centre of the screen (whole-face presentation) as broadband images (Experiment 1), filtered at low-SF (Experiment