
Periodic Reporting for period 1 - Daphne (Circuits of Visual Attention)

Teaser

Due to the limited neuronal resources of any nervous system, extracting relevant sensory information from cluttered natural environments is critical to allocate computational power correctly. In this proposal, we explore the neuronal mechanisms used by the nervous system to...

Summary

Due to the limited neuronal resources of any nervous system, extracting relevant sensory information from cluttered natural environments is critical to allocate computational power correctly. In this proposal, we explore the neuronal mechanisms used by the nervous system to attend to visual cues and thus enable appropriate behaviors. To do so, we study the visual transformations of the mouse Superior Colliculus (SC), an evolutionarily conserved midbrain area known to perform sensorimotor transformations and to be involved in the allocation of attention. By understanding the principles underlying sensorimotor transformation, our work will contribute to building a framework to study attention in health and disease.

The overall objectives of our project are (i) to provide a detailed description of visual representation in the SC, focusing on understanding how defined retinal information-streams, like motion and color, contribute to these properties; (ii) to understand the relationship between motor instructions and sensory coding; and (iii) to explore how higher brain areas can modulate sensory transformations within the SC.

Work performed

In the first part of our ERC project, we have focused on five parallel aspects, developing complementary approaches to study attention-like processes, each of which required dedicated engineering of novel experimental paradigms.

The projects mentioned below relate to the ERC objectives in the following way:

Projects I & III – Objective 1, Aims 1.1 and 1.2.
Project II – Objective 2, Aims 3.1 and 3.2, and Objective 3, Aim 3.1.
Project IV – Objective 1, Aim 1.2.
Project V – Objective 2, Aim 2.1.

(I) Multi-photon imaging of the SC in awake behaving animals presents evidence of dynamic saliency coding.

Here we developed a paradigm that allows us to image collicular activity in awake behaving animals immersed in a virtual reality arena. Great care was taken to improve the immersion of the animal in the virtual surroundings by developing a novel treadmill system, a head-clamping system, a visual stimulus matched to the mouse visual system, and co-registration of facial and eye movements (Fig. 1). In addition, we explored several different surgical procedures to improve access without damaging other brain areas while, at the same time, improving brain-motion stability. We have screened different transgenic and viral approaches to improve the experimental paradigm. This system is now up and running and allows us to image collicular response properties in the same animals across days (Fig. 1D, E). Our imaging sessions last for up to 2 hours, a limit self-imposed for the well-being of our animals. Our first preliminary results indicate that the superficial layers of the superior colliculus already show saliency coding that is dynamic, dependent only on the average statistics of the stimulus. Thus, cells in the superficial layer of the superior colliculus appear to adjust their processing dynamics to enhance unexpected events.

Fig. 1. Awake behaving imaging experiments of visual responses in the SC. A. Experimental set-up: head-fixed mouse during recording. B. Example region of interest in the Superior Colliculus. C. Extracted calcium traces. D. Example stimulus used to characterize flow-field responses in the SC. E. Example flow fields and receptive field computed for a single SC neuron.
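To illustrate what saliency coding driven by the average statistics of the stimulus could look like computationally, the following is a minimal, purely illustrative sketch in Python (a toy model, not our actual analysis pipeline): a unit whose baseline slowly tracks the running average of the stimulus responds most strongly to deviations from that average, i.e., to unexpected events.

import numpy as np

def adaptive_saliency_response(stimulus, tau=20.0):
    """Toy salience signal: stimulus minus a leaky running mean.

    stimulus : 1-D array of stimulus intensity over time (arbitrary units)
    tau      : adaptation time constant in samples (assumed value)
    """
    running_mean = 0.0
    alpha = 1.0 / tau
    response = np.zeros_like(stimulus, dtype=float)
    for t, s in enumerate(stimulus):
        # the response reflects the deviation from the adapted baseline
        response[t] = s - running_mean
        # the baseline slowly tracks the average statistics of the scene
        running_mean += alpha * (s - running_mean)
    return response

# Example: a constant background with one brief, unexpected increment
stim = np.full(200, 1.0)
stim[150:155] = 3.0                      # "salient" event
resp = adaptive_saliency_response(stim)
print(resp[145:156])                     # the event stands out against the adapted background

In such a toy model, the same physical stimulus evokes a large response only when it deviates from the recently experienced statistics, which is the kind of behaviour our preliminary data hint at for superficial SC neurons.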

(II) Neuromodulation of Sensory Transformations.

We have established a closed-loop virtual environment arena that enables 360 degrees of visual stimulation, real-time tracking, and brain imaging of unrestrained, freely walking animals (Fig. 2A, B). The overall goal of this set-up is to study visual processing in a different context, e.g., hunting, that might be state-dependent and strongly modulated. Importantly, this set-up allows us to control all visual statistics of the environment in a closed-loop fashion. At the same time, we have improved the viral infection procedure and GRIN lens implantation at different anterior-posterior coordinates, to be able to infect and record defined populations of neurons in the superficial SC (Fig. 2C-F). The system is up and running, but we have encountered difficulties in obtaining stable and reliable visual responses using endoscope imaging, which we are currently troubleshooting.
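As a schematic of how such a closed-loop arena operates, the sketch below (illustrative only; the tracking and rendering functions are simple stand-ins, not the actual arena software) shows the basic cycle: read the animal's pose in real time, update the surrounding 360-degree stimulus accordingly, and log both streams together for later alignment with the imaging data.

import math
import time

def read_tracker(t):
    """Stand-in for real-time tracking: return a fake heading (radians)."""
    return 0.5 * math.sin(0.2 * t)

def render_panorama(heading):
    """Stand-in for the 360-degree display: shift the scene opposite to the
    animal's heading so the virtual world appears stable around it."""
    return -heading  # scene offset in radians

def closed_loop_session(duration_s=2.0, rate_hz=60.0):
    """Toy closed loop: track pose, update the surrounding scene, log both."""
    dt, log, t0 = 1.0 / rate_hz, [], time.time()
    while (t := time.time() - t0) < duration_s:
        heading = read_tracker(t)
        scene_offset = render_panorama(heading)
        log.append((t, heading, scene_offset))
        time.sleep(dt)  # crude pacing; a real system locks to the display refresh
    return log

if __name__ == "__main__":
    frames = closed_loop_session()
    print(f"logged {len(frames)} closed-loop frames")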

Within this project, we have been focusing on the neuromodulatory effect of the serotonergic system (Fig. 2G, H), since it has been implicated in directly modulating retinal information streams in rodents and in modifying the neuronal representation of visual stimuli in zebrafish during states of hunger, where it improves the representation of prey-like stimuli. Within this context, we have established a system that enables the direct modulation of the serotonergic system while monitoring its effects on sensory processing and behavior, in order to study how state-dependent modulation can change sensory processing.

Currently, all pieces of this project have been established and we are combining them.

Possible concerns: Although the endoscope recordings show neuronal responses of good quality, there are only a f

Final results

We have already established several independent but complementary approaches to understand sensory processing in the SC. Within these projects, we have shown that we are capable of recording large-scale population dynamics in the retina (up to ~4000 retinal ganglion cells simultaneously in one field of view). This large-scale approach is relevant to understand the subsequent specialization in the SC, allowing us to (i) compare similarly large fields of view in both tissues (retina and SC), and (ii) use this method in future experiments to functionally characterize specific inputs to collicular interneurons. We have also engineered a virtual reality arena that enables us to image, using a custom-built multi-photon microscope, fields of view of up to 3 mm² at rates above 50 Hz. This technical accomplishment, combined with an innovative treadmill system and visual stimulation procedures, enabled us to start questioning the predictive dynamics of collicular interneurons. Our preliminary results suggest that neurons in the SC can quickly adapt to the average statistics of a scene and thus enable the extraction of salient stimuli (stimuli with a change in statistics) in a cluttered background. Furthermore, we have engineered setups that allow us (i) to image the SC while animals are freely moving in a virtual environment and (ii) to record electrophysiologically from awake behaving animals. These paradigms will allow us to study state-dependent changes in behavioral coding and to carefully characterize subthreshold visual response properties of visual interneurons, respectively. Finally, we are finishing a multi-photon system that should enable us to study multisensory and motor commands in deeper collicular layers. With these tools in hand, we expect to provide a detailed description of visual and multisensory representation (projects I and V), a better understanding of sensory-driven attention and saliency (projects I, III and IV), and finally mechanisms of state-dependent modulation of visual processing (project II).