
Visual Neuroscience & Perception

The fundamental research arm of our team tackles many topics that aim to help us understand how the brain works to enable us to see form, movement, colour and depth, and how vision interacts with the other senses. Some of our current areas of research are described below.


The Electroretinogram (ERG)

The human electroretinogram (ERG) is an electrical signal that originates in the retina and can be measured non-invasively. Our research aims to provide a complete characterisation of the contributions from the different photoreceptor types (rods, L-, M- and S-cones) to the ERG, thus providing detailed insight into the functional integrity of the human retina.

Our research uses state-of-the-art LED stimulators to isolate the activity of individual photoreceptor populations. Studying the different photoreceptor types in the retina in this way enables us to develop improved stimuli for the diagnosis of retinal diseases such as rod and cone dystrophies.
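Photoreceptor-isolating stimuli of this kind are commonly built by silent substitution: the LED modulations are chosen so that one photoreceptor class sees contrast while the others see none, which reduces to solving a small linear system. A minimal sketch of that step, assuming four LED primaries; the sensitivity numbers below are illustrative placeholders, not calibrated values:

```python
import numpy as np

# Rows: photoreceptor classes (rod, L-, M-, S-cone); columns: LED primaries.
# Entry [i, j] = effective excitation of receptor i per unit modulation of LED j.
# These values are made-up placeholders, not measured sensitivities.
A = np.array([
    [0.40, 0.90, 0.55, 0.10],   # rod
    [0.20, 0.80, 0.95, 0.05],   # L-cone
    [0.25, 0.95, 0.60, 0.08],   # M-cone
    [0.85, 0.10, 0.05, 0.90],   # S-cone
])

# Target: modulate rods only (a "rod-isolating" stimulus); cones stay silent.
target = np.array([1.0, 0.0, 0.0, 0.0])

# Solve for the LED modulation vector that produces exactly this excitation.
led_modulation = np.linalg.solve(A, target)

excitations = A @ led_modulation
print(np.round(excitations, 6))  # ~ [1, 0, 0, 0]
```

With four LEDs and four receptor classes the system is exactly determined; in practice the solution must also be checked against the gamut (maximum modulation depth) of the stimulator.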

Motion Trajectories & Memory

Our ability to process motion trajectories is fundamental to survival and to interacting with our natural environment. In that environment, animate and inanimate objects (e.g. cars, cricket balls) move and change (traffic lights, occlusion), and to interact with it we must remember and continually update our mental representation of it. We use psychophysics and modelling to address questions such as:

  • How many moving objects can we monitor at the same time, and how precisely can we monitor them? 
  • How is information transmitted and stored in the different stages of memory? 
  • How is memory for motion information influenced by factors such as attention? 
  • How do the findings for motion memory generalise to other properties of objects, such as their position or colour?  
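One simple way to formalise the first question is a resource model of tracking, in which a fixed precision budget is shared among the monitored objects, so report noise grows with set size. A hypothetical sketch of that prediction; the budget `J_TOTAL` and the equal-split assumption are illustrative, not our fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)

J_TOTAL = 40.0      # hypothetical total precision budget (precision = 1 / variance)
N_TRIALS = 10_000

def report_sd(n_objects, j_total=J_TOTAL):
    """Standard deviation of direction reports when precision is split equally."""
    j_per_object = j_total / n_objects
    return 1.0 / np.sqrt(j_per_object)

# Simulate direction-report errors for increasing numbers of tracked objects.
for n in (1, 2, 4, 8):
    errors = rng.normal(0.0, report_sd(n), N_TRIALS)
    print(f"{n} objects: predicted SD = {report_sd(n):.3f}, "
          f"simulated SD = {errors.std():.3f}")
```

Under this model, quadrupling the number of tracked objects doubles the report noise; comparing that prediction against observed errors is one way psychophysical data can discriminate resource models from fixed-slot models of tracking.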

Multisensory Perception 

Real-world events provide the brain with auditory, visual and tactile input. Often, a single event generates all three input types simultaneously. Somehow, the brain must combine these inputs into a single coherent whole, allowing us to group information with a common source. This process is known as multisensory integration. Our research examines how the brain performs this feat despite the radical differences in the properties of the receptor surfaces (the eye’s retina, the ear’s cochlea, the skin’s mechanoreceptors) that transduce this information.
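A standard account of this integration is maximum-likelihood cue combination: each sense's estimate is weighted by its reliability (inverse variance), and the combined estimate is more reliable than either cue alone. A minimal sketch, assuming independent Gaussian cues; the example means and variances are illustrative:

```python
import numpy as np

def combine(estimates, variances):
    """Reliability-weighted average of independent Gaussian cue estimates."""
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    combined_mean = weights @ estimates          # weighted mean of the cues
    combined_var = 1.0 / reliabilities.sum()     # smaller than any single-cue variance
    return combined_mean, combined_var

# e.g. visual and auditory location estimates (cm) for the same event:
mean, var = combine([10.0, 14.0], [1.0, 4.0])
print(mean, var)  # 10.8 0.8 — pulled toward the more reliable (visual) cue
```

The same weighting predicts classic illusions such as the ventriloquist effect: when vision is much more reliable than hearing, the combined location is captured almost entirely by the visual cue.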

Time Perception

How does the brain tell time? Two types of time perception are the focus of our interest. The first is the relative perceived timing of events: how do we know that event A happened before event B? We use adaptation to manipulate perceived temporal order in ways that help us better understand how humans synchronise their perception with devices such as touch-screens. Our second area of interest centres on perceived duration: the process of estimating how long events lasted. Why is it that ‘time flies when you are having fun’ while ‘a watched pot never boils’? There are specialised areas of the brain dedicated to the perception of spatial location, but nothing comparable for the perception of event duration. What does that tell us about the neural underpinnings of temporal perception, and does it explain why our time perception is often distorted and variable?
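Perceived temporal order is typically measured with a temporal order judgement (TOJ) task: a psychometric function is fitted to the proportion of "A first" responses across onset asynchronies, and the point of subjective simultaneity (PSS) is read off; adaptation then shifts that point. A hypothetical sketch with made-up response data (the SOAs, proportions, and touch-vs-flash design are illustrative, not an actual experiment):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus onset asynchrony (ms): negative = touch before flash (made-up design).
soa = np.array([-120, -80, -40, 0, 40, 80, 120], dtype=float)
# Made-up proportions of "flash first" responses after adaptation.
p_flash_first = np.array([0.05, 0.10, 0.25, 0.40, 0.70, 0.90, 0.97])

def psychometric(x, pss, sigma):
    """Cumulative Gaussian: P("flash first") as a function of SOA."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_flash_first, p0=(0.0, 50.0))
print(f"PSS = {pss:.1f} ms, slope (sigma) = {sigma:.1f} ms")
```

A non-zero PSS means physically simultaneous events are not perceived as simultaneous; comparing PSS before and after adaptation quantifies how far perceived synchrony has been recalibrated.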

Visual Psychophysics & Computational Vision

This area of research aims to understand the basic mechanisms of human vision. Topics include how humans detect edges, search for and recognise objects in cluttered environments, and process features of everyday images such as contrast, blur and colour. We use psychophysics and computational modelling to build improved theoretical models of how the human visual system transforms the light arriving at the eye into a coherent percept, enabling us to perform the everyday tasks we take for granted, such as locating and recognising objects.
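Edge detection of the kind studied here is often modelled with centre–surround receptive fields, for example a difference-of-Gaussians (DoG) filter whose response peaks at luminance boundaries. A minimal sketch on a synthetic one-dimensional luminance step; the filter sizes are illustrative, not fitted to human data:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic 1-D luminance profile: a dark-to-light step edge between samples 99 and 100.
luminance = np.concatenate([np.full(100, 0.2), np.full(100, 0.8)])

# Difference-of-Gaussians: narrow centre minus broader surround (illustrative sigmas).
dog_response = gaussian_filter1d(luminance, 2.0) - gaussian_filter1d(luminance, 6.0)

# The model "sees" the edge where the filter response is largest in magnitude.
edge_position = int(np.argmax(np.abs(dog_response)))
print(edge_position)
```

Because the DoG removes the mean luminance, the same filter responds to contrast rather than absolute light level, one reason such models are a useful starting point for explaining how perceived contrast and blur behave in everyday images.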