
Visual information sampling and attentional control

Humans are usually quite efficient in attentional selection: they manage to focus their processing capacity on relevant information and can ignore a vast amount of irrelevant information. However, most of what we know about attentional selection stems from experiments performed in controlled laboratory settings with simplified stimuli and predefined targets and distractors. Do humans use similar mechanisms when sampling visual information in natural, unstructured environments? In real-world scenarios, observers often perform several tasks at the same time and have to decide which signals are relevant or might become relevant in the near future. How do humans decide where to attend, and how do they interpret their visual environment to optimize their behavior?

We investigate how observers adapt their attentional control settings to optimize information sampling under dynamically changing conditions. We want to learn more about the (strategic) factors that influence this adaptation.

In particular, our projects focus on

  • Visual foraging as a tool for understanding attentional strategies
  • Target choice in dynamically changing environments
  • Context learning and visual attention
  • Attentional control in the context of social perception: Victim sensitivity and eye movements


Visual foraging as a tool for understanding attentional strategies

Principal investigators: Jan Tünnermann, Kevin Hartung, Anna Schubö

For humans and other organisms, visual foraging is an important behavior for acquiring resources (e.g., food). Active vision and selective visual attention facilitate selecting target objects, especially in crowded scenes. Computer-based virtual foraging tasks therefore provide a valuable tool for assessing attentional control: selection histories in multi-item displays reveal how participants deploy their attention to trade off various strategic factors.

For instance, switching search templates appears to require some effort: experiments conducted in the 1970s showed that chickens tend to pick grains of the same color over a longer period, even if the grains are just as tasty as differently colored alternatives. We conduct computer-based experiments to investigate visual foraging and attentional strategies in humans. In these experiments, participants typically have to select target items and avoid distractor items under continuously changing conditions (e.g., a gradual change in target-distractor similarity or frequency). Participants forage using touch-screen responses, a virtual stylus, or eye movements.
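To make the task structure concrete, the following Python sketch generates one multi-item foraging display in which distractor color frequencies can be shifted gradually across trials. All names, item counts, and colors here are illustrative assumptions, not parameters taken from the published experiments.

```python
import random

def foraging_trial(n_items=40, reddish_fraction=0.5, seed=None):
    """Generate one multi-item foraging display (illustrative sketch).

    Half of the items are targets (red or blue), the rest are distractors
    whose color frequencies follow `reddish_fraction`.
    """
    rng = random.Random(seed)
    n_targets = n_items // 2
    n_distractors = n_items - n_targets
    # Targets: each is randomly red or blue, so observers can choose.
    targets = [{"role": "target", "color": rng.choice(["red", "blue"])}
               for _ in range(n_targets)]
    # Distractors: reddish/bluish mixture controlled by reddish_fraction.
    n_reddish = round(n_distractors * reddish_fraction)
    distractors = ([{"role": "distractor", "color": "reddish"}] * n_reddish +
                   [{"role": "distractor", "color": "bluish"}]
                   * (n_distractors - n_reddish))
    items = targets + distractors
    rng.shuffle(items)  # random spatial order in the display
    return items

# Continuously changing conditions: shift distractor color frequencies
# gradually from all bluish to all reddish across a block of trials.
for trial in range(5):
    display = foraging_trial(reddish_fraction=trial / 4, seed=trial)
```

Under this kind of gradual shift, an observer's selection history (which target color is picked when) can reveal whether and when they switch their search template.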

Related literature:

Tünnermann, J., & Schubö, A. (2019). Foraging with dynamic stimuli. Journal of Vision (VSS).

Bergmann, N., Tünnermann, J., & Schubö, A. (2019). Which search are you on? Adapting to color while searching for shape. Attention, Perception & Psychophysics, 16(3).

Kristjánsson, Á., Jóhannesson, Ó. I., & Thornton, I. M. (2014). Common attentional constraints in visual foraging. PLOS ONE, 9, e100752.

Target choice in dynamically changing environments

Method: Behavioral & Eye-Tracking
Project: Target choice in dynamically changing environments
Contact: Jan Tünnermann, Nils Bergmann, Yun Yun Mu, Anna Schubö
Abstract: Intentions and external stimuli in the environment guide human visual attention. When the visual environment changes gradually, observers can potentially adjust to regularities in these changes and make strategic use of them. For instance, if observers can choose between different targets (e.g., a red one or a blue one), they might prefer targets dissimilar to distracting elements. If these distracting elements change predictably over time (e.g., the number of bluish distractors decreases while that of reddish ones increases), observers might adjust their target preference accordingly, switching from one target color to the other as it becomes “more unique”.

Related literature:

Bergmann, N., Tünnermann, J., & Schubö, A. (2019). Which search are you on? Adapting to color while searching for shape. Attention, Perception & Psychophysics, 16(3).

Context learning and visual attention

Principal investigators: Nils Bergmann, Anna Schubö

In this project, which is part of the RTG 2271, we examine the impact of statistical learning of visual context information on attention. Contextual cueing experiments have demonstrated that learning context configurations improves the efficiency of localizing a target (Chun & Jiang, 1998). We use contextual cueing tasks to investigate spontaneous learning of context regularities and its relation to attention guidance. A series of studies examines the influence of reward on context learning (e.g., Bergmann, Koch, & Schubö, 2019). Results show that the expectation of reward sensitizes participants to repeating context information: it facilitates context learning and guides attention more efficiently to the target.

Related literature:

Bergmann, N., Tünnermann, J., & Schubö, A. (2020). Reward-predicting distractor orientations support contextual cueing: Persistent effects in homogeneous distractor contexts. Vision Research, 171, 53–63.

Bergmann, N., Koch, D., & Schubö, A. (2019). Reward expectation facilitates context learning and attentional guidance in visual search. Journal of Vision, 19(3), 1–18.

Feldmann-Wüstefeld, T., & Schubö, A. (2014). Stimulus homogeneity enhances implicit learning: evidence from contextual cueing. Vision Research, 97, 108–116.

Schankin, A., & Schubö, A. (2010). Contextual cueing effects despite spatially cued target locations. Psychophysiology, 47, 717–727.

Attentional control in the context of social perception: Victim sensitivity and eye movements

Principal investigators: Nils Bergmann, Merle Buchholz, Mario Gollwitzer, Anna Schubö

This project is a collaboration between project no. 04, “Social-cognitive processes underlying the persistence of (un)trustworthiness expectations”, and project no. 09, “Expectation and selective attention”, and is part of the RTG 2271. The project investigates the effects of (un)trustworthiness expectations on eye movements and examines the impact of victim sensitivity on visual information processing. Previous research on victim sensitivity has largely neglected the attentional processes that take place during confrontations with injustice. Our project fills this gap by directly assessing the attentional mechanisms underlying the processing of expectation-inconsistent trustworthiness information in victim-sensitive individuals. Attentional processes are measured with eye tracking during confirmations and violations of (un)trustworthiness expectations. In a subsequent step, we compare eye movements across participants with varying degrees of victim sensitivity.