

Picture: A.Gloriani

Research agenda

Humans execute several eye movements per second to project objects of interest onto the fovea, the central area of the retina that allows high-acuity vision. In our lab we study several aspects of the interaction between eye movements and visual perception.

Integration of bottom-up and top-down processing

For each eye movement, the brain has to trade off competing bottom-up and top-down signals, such as visual salience, object recognition, plans, and value. We investigate how these different signals are integrated for eye movement control.

Perceptual consequences of eye movements

The execution of an eye movement changes the visual input on the retina dramatically, yet we perceive the world as stable and homogeneous. Here we study how perception is modulated by the execution of eye movements and how visual information is maintained and integrated across eye movements.

Learning and optimization

Eye movements are not only among the most frequent but also among the most accurate and precise movements humans make. We study how learning mechanisms maintain this high performance and how eye movements can be optimized for different perceptual tasks.

Individual differences in perceptual preferences

The visual system is often confronted with ambiguous visual input and has to choose between different possible interpretations of the sensory evidence at hand. Previous research showed that humans often exhibit perceptual preferences for specific interpretations. Here we study how these preferences are acquired and how they differ between individuals.




In our labs we study the interaction of eye movements and visual perception. We measure participants' eye movements while they observe objects on a monitor and perform oculomotor and/or visual tasks. The eye trackers we use emit infrared light and measure its reflection. Because the pupil reflects hardly any of this light, the eye tracker detects a dark spot in the camera image and interprets it as the pupil. To determine where people are looking on the screen, we run a calibration procedure during which participants fixate a sequence of dots at known positions. This allows us to map the tracker's raw pupil coordinates onto gaze positions on the monitor.
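The calibration step described above can be sketched as a simple regression problem: given raw pupil coordinates recorded while the participant fixated dots at known screen positions, fit a mapping from raw coordinates to screen pixels. The following is a minimal illustrative sketch, assuming an affine model and made-up calibration data; real eye trackers such as the EyeLink use their own (typically higher-order) calibration routines, and all names and numbers here are hypothetical.

```python
import numpy as np

def fit_affine_calibration(raw_xy, screen_xy):
    """Fit an affine map screen = [x, y, 1] @ C by least squares.

    raw_xy, screen_xy: (N, 2) arrays of corresponding points
    (raw tracker coordinates and known screen positions).
    Returns a (3, 2) coefficient matrix C.
    """
    n = raw_xy.shape[0]
    design = np.hstack([raw_xy, np.ones((n, 1))])  # add intercept column
    coeffs, *_ = np.linalg.lstsq(design, screen_xy, rcond=None)
    return coeffs

def map_gaze(raw_xy, coeffs):
    """Map raw tracker coordinates to screen coordinates (pixels)."""
    n = raw_xy.shape[0]
    design = np.hstack([raw_xy, np.ones((n, 1))])
    return design @ coeffs

# Hypothetical 9-point calibration grid on a 1920 x 1080 screen.
rng = np.random.default_rng(0)
dots = np.array([[x, y] for x in (160, 960, 1760) for y in (120, 540, 960)],
                dtype=float)                           # screen dots (pixels)

# Simulate raw tracker data by inverting a made-up ground-truth affine map,
# then adding measurement noise.
true_C = np.array([[2.0, 0.1], [-0.1, 2.2], [50.0, 30.0]])
raw = (dots - true_C[2]) @ np.linalg.inv(true_C[:2])
raw += rng.normal(scale=0.2, size=raw.shape)

coeffs = fit_affine_calibration(raw, dots)
pred = map_gaze(raw, coeffs)
err = np.abs(pred - dots).max()
print(f"max calibration error: {err:.2f} px")
```

With nine calibration dots and an affine model, the least-squares fit is overdetermined, which makes the mapping robust to noise in individual fixations; in practice, validation fixations after calibration are used to check that the residual error is acceptably small.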

Equipment in the small laboratory

Desktop-mounted EyeLink 1000+ (SR Research)
ViewPixx monitor (VPixx)

Equipment in the large laboratory

Tower-mounted EyeLink 1000+ (SR Research)
ProPixx projector (VPixx)
Rear projection screen (Stewart)