
Periodic Reporting for period 3 - DynaSens (Understanding the neural mechanisms of multisensory perception based on computational principles)

Teaser

The fact that we are equipped with multiple sensory modalities, such as vision, hearing or touch, provides considerable benefits for perception. Depending on the current needs and the composition of the sensory environment, we can selectively integrate information across the...

Summary

The fact that we are equipped with multiple sensory modalities, such as vision, hearing or touch, provides considerable benefits for perception. Depending on the current needs and the composition of the sensory environment, we can selectively integrate information across the different senses. For example, the adaptive combination of multiple sensory inputs increases our ability to identify objects in a sensory scene or to communicate with one another in a noisy environment. At the same time, our senses can fool us when information is incorrectly combined across them, as in the ventriloquist illusion. While our brain handles these multiple sensory inputs efficiently, we still have a limited understanding of the neural mechanisms that underlie multisensory integration. This project seeks to advance our knowledge of how the brain processes its sensory environment by linking the underlying brain mechanisms with specific multisensory computations and with perception.
Understanding the neural mechanisms underlying multisensory integration is important not only to reveal how the brain creates a coherent and unified percept of our environment, but also to pinpoint mechanisms that may be relevant for perceptual deficits that emerge during aging or in specific clinical conditions. For example, specific deficits in multisensory perception, more so than in unisensory perception, appear to emerge in the elderly and are present in individuals with autism. Furthermore, knowledge about the perceptual and neural principles underlying multisensory processing is also relevant for the design of human-computer interfaces, as these pose specific challenges for the combination of multiple sensory cues from the environment.
This project aims to advance our basic understanding of the neural mechanisms underlying multisensory integration by addressing the following timely questions: What are the neural processes transforming multiple sensory inputs into a unified representation guiding behaviour? How does the brain control the dynamic weighting of multiple inputs and assign these to either a single or multiple causes? Which perceptual and neural processes are affected in the multisensory deficits seen in autistic individuals or the elderly? We address these questions by combining neuroimaging in human volunteers with advanced statistical analysis of brain activity, in conjunction with modelling approaches that link brain activity to specific processes involved in multisensory integration.
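The dynamic weighting of multiple inputs mentioned above is often formalised as reliability-weighted (maximum-likelihood) cue combination, a standard normative model in this field: each cue is weighted by its reliability, i.e. its inverse variance. The sketch below illustrates this principle only; the function and variable names are hypothetical and do not represent the project's actual code.

```python
def fuse_cues(mu_a, var_a, mu_v, var_v):
    """Combine an auditory and a visual estimate, weighting each
    by its reliability (inverse variance)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    w_v = 1.0 - w_a
    mu = w_a * mu_a + w_v * mu_v
    # The fused estimate is more reliable than either cue alone.
    var = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return mu, var

# Example: a reliable visual cue (low variance) dominates a noisy
# auditory cue, pulling the fused location estimate towards it.
mu, var = fuse_cues(mu_a=10.0, var_a=4.0, mu_v=0.0, var_v=1.0)
# mu = 2.0, var = 0.8
```

The ventriloquist illusion fits this picture: when vision is far more reliable than audition, the fused location estimate is captured by the visual cue.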
We hope that by following this agenda we can provide a more principled and comprehensive understanding of how the brain handles and merges multiple sensory inputs, and pave the way for a framework for addressing pressing problems associated with multisensory perceptual deficits seen in cognitive disorders and during our life span.

Work performed

Our ongoing work focuses on the brain mechanisms underlying the dynamic weighting of sensory information arriving in the auditory and visual senses. We focus on different sensory scenarios, defined by different types of stimuli. So far we have implemented studies involving visual and acoustic motion cues, judgements about the location of brief stimuli, ratings of temporal flicker, as well as speech. For each kind of stimulus we implemented psychophysical tasks that require judgements about specific sensory attributes, such as locating a stimulus, judging the temporal order of stimuli, or comprehending a specific word. We then combined these behavioural tasks with high-resolution neuroimaging of brain activity, implemented via either electroencephalography (EEG) or magnetoencephalography (MEG), both of which provide non-invasive measurements of electric brain activity. To analyse these measurements, and to extract and disentangle the different neural processes involved in either the analysis of uni-sensory (sensory-specific) information or in merging sensory information, we used sophisticated statistical and model-based analyses. To achieve this ambitious aim, we have tested and advanced three strands of analysis methodology that can link single-trial measures of brain activity with specific sensory attributes, with model-based predictions about the sensory integration process, or with the participant’s behavioural choice on each trial. We are currently applying these to the different sensory paradigms under consideration, to disentangle processes mediating cue integration from other uni- and multi-sensory processes and from choice-related activity, in order to map the transformation from sensory inputs to perception. This should provide a detailed understanding of the neural mechanisms underlying cue integration (‘where’ and ‘how’).
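One common way to link single-trial model predictions to measured brain activity is a simple per-trial regression: the model's predicted quantity on each trial serves as a regressor for the neural measure. The sketch below illustrates this idea on simulated data; all names and numbers are hypothetical, and the project's actual pipelines are considerably more elaborate.

```python
import random

random.seed(1)
n_trials = 200
# Hypothetical per-trial model prediction (e.g. a fused sensory estimate)
x = [random.gauss(0.0, 1.0) for _ in range(n_trials)]
# Simulated neural measure that partly tracks the model prediction
y = [0.5 * xi + random.gauss(0.0, 0.1) for xi in x]

mean_x = sum(x) / n_trials
mean_y = sum(y) / n_trials
# Regression slope: covariance(x, y) / variance(x)
num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
den = sum((xi - mean_x) ** 2 for xi in x)
beta = num / den
# beta estimates how strongly the neural measure reflects the prediction
```

Applied separately to each sensor or time point, such a slope estimate indicates where and when brain activity tracks the model-predicted integration process.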
The results so far reveal a cascade of processes involved in the flexible use of multisensory information, starting with the analysis of unisensory information in sensory-specific brain regions, continuing with the computation of a merged representation of multisensory information in parietal cortex, and culminating in the flexible selection or combination of these representations in the frontal lobe. Our results thereby reconcile a number of previous proposals for how the flexible use of multisensory information is established in the brain, and demonstrate how multisensory integration, implemented in sensory and parietal regions, is under cognitive control from prefrontal cortex. Furthermore, we investigated how the composition of the multisensory environment at one point in time affects how we subsequently judge other uni- or multi-sensory stimuli. We found two sources of such sequential control over multisensory integration: one implemented in parietal regions, where persistent memory of a previous stimulus influences how we perceive a subsequent stimulus, regardless of the presented modality, and another in prefrontal regions, where the overall congruency of the sensory information on a previous trial affects subsequent behaviour. Altogether, these results begin to dissociate neural processes implementing sensory integration itself from those that control whether, when and how sensory information is combined between modalities, or sequentially over time.

Final results

By comparing the mechanisms underlying the flexible use of multisensory information across different types of sensory stimuli and different tasks, we begin to see and disentangle common mechanistic patterns as well as task- or stimulus-specific processes, an issue that has often been neglected in previous work. In particular, many previous studies have directly or implicitly assumed that sensory information is always combined when available. However, in our studies we account for the flexibility of human observers to combine sensory information when it is meaningful and of benefit, and to refrain from combining sensory evidence that does not seem to belong together. Accounting for this flexibility of individual observers has made our approach more complex, but at the same time also more powerful. By linking neural processes implementing sensory perception with computational models of multisensory perception, we have been able to directly link local brain activity with sensory-specific and mechanistically interpretable computations. This has allowed us to dissociate processes implementing the combination of sensory information from those controlling this integration process based on task demands or other contextual evidence. In general, unravelling the cascade of uni- and multisensory processes underlying perception will pave the way to pinpointing those that are affected in conditions where sensory integration fails and behavioural performance declines, such as in the elderly or in specific clinical conditions. We will directly address this by comparing healthy young individuals with, for example, healthy elderly individuals in how they combine multisensory information under different experimental conditions, to gain insight into whether and how the combination of sensory information changes with age.

Website & more info

More info: http://www.uni-bielefeld.de/biologie/cns/.