
OBJECTPERMOD (status: SIGNED)

Explaining object permanence with a deep recurrent neural network model of human cortical visual cognition

Project "OBJECTPERMOD" data sheet

The following table provides information about the project.

Coordinator: UNIVERSITY OF GLASGOW

Organization address
address: UNIVERSITY AVENUE
city: GLASGOW
postcode: G12 8QQ
website: www.gla.ac.uk

Contact info: not available (title, name, surname, function, email, telephone, fax)

Coordinator country: United Kingdom [UK]
Total cost: 271,732 €
EC max contribution: 271,732 € (100%)
Programme: 1. H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
Call code: H2020-MSCA-IF-2018
Funding scheme: MSCA-IF-GF
Starting year: 2019
Duration: from 2019-09-01 to 2022-08-31

 Partnership

Take a look at the project's partnership.

#  Participant                                                Country        Role         EC contrib. [€]
1  UNIVERSITY OF GLASGOW                                      UK (GLASGOW)   coordinator  271,732.00
2  TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK    US (NEW YORK)  partner      0.00

 Project objective

Visual cognition is our ability to recognize the things we see around us and make inferences about their meaning and relationships. Deep convolutional neural network (CNN) models now achieve human-level performance on certain visual recognition tasks and currently provide the most powerful models of human visual cognition. A hallmark step in the development of human visual cognition is the acquisition of object permanence (OP). Object permanence is the ability to continue to mentally represent an object that has disappeared from view, for example because it is hidden behind another object. Current deep neural network models of vision lack this fundamental ability, limiting their power as models of human visual cognition and as artificially intelligent systems. In this action, I will study the computational mechanisms necessary for OP using a highly innovative approach that combines four elements: (1) a novel behavioral task that requires OP, (2) development of deep recurrent neural network models, (3) testing of both human participants and models on the task, and (4) measurement of brain activity with functional magnetic resonance imaging (fMRI) during task performance. The OP task involves viewing a scene of moving objects that occasionally become occluded behind other objects. Models will be trained to represent objects continually, even as they vanish behind an occluder, and selected to match behavioral and cortical-layer-resolved high-field fMRI data of human observers. The hosts, Prof Kriegeskorte at Columbia University and Prof Muckli at the University of Glasgow, are world-leading experts on deep neural network models of vision and cortical-layer-resolved high-field fMRI, respectively. The outcome of this action, a biologically plausible deep recurrent convolutional model that can explain behavior and brain activity, will significantly enhance our understanding of the computational principles of visual cognition, with implications also for AI technology.
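
The objective above describes, at a high level, a recurrent convolutional architecture whose internal state can keep representing an object after it disappears behind an occluder. Purely as an illustration of that general idea (not the model developed in OBJECTPERMOD), the minimal PyTorch sketch below shows a convolutional recurrent cell whose hidden feature map persists across video frames; all class names, layer sizes, and the output target are assumptions made for this example.

# Illustrative sketch only: a convolutional GRU-style cell whose hidden state
# is carried from frame to frame, so information about a temporarily occluded
# object can in principle be maintained. Not the OBJECTPERMOD model.
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    """Convolutional GRU cell: the hidden feature map persists over time."""
    def __init__(self, in_ch: int, hid_ch: int, k: int = 3):
        super().__init__()
        pad = k // 2
        self.gates = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=pad)
        self.cand = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=pad)
        self.hid_ch = hid_ch

    def forward(self, x, h=None):
        if h is None:
            h = torch.zeros(x.size(0), self.hid_ch, x.size(2), x.size(3),
                            device=x.device)
        zr = torch.sigmoid(self.gates(torch.cat([x, h], dim=1)))
        z, r = zr.chunk(2, dim=1)                     # update / reset gates
        h_tilde = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde              # new hidden state

class OcclusionTracker(nn.Module):
    """Encodes each frame, updates the recurrent state, and decodes a
    per-pixel estimate of the tracked object's location, even for frames
    in which the object is hidden behind an occluder (once trained)."""
    def __init__(self, hid_ch: int = 16):
        super().__init__()
        self.encoder = nn.Conv2d(3, hid_ch, 3, padding=1)
        self.rnn = ConvGRUCell(hid_ch, hid_ch)
        self.decoder = nn.Conv2d(hid_ch, 1, 1)        # object-location map

    def forward(self, frames):                        # frames: (B, T, 3, H, W)
        h, outputs = None, []
        for t in range(frames.size(1)):
            h = self.rnn(torch.relu(self.encoder(frames[:, t])), h)
            outputs.append(self.decoder(h))
        return torch.stack(outputs, dim=1)            # (B, T, 1, H, W)

if __name__ == "__main__":
    model = OcclusionTracker()
    video = torch.randn(2, 10, 3, 64, 64)             # toy batch of 10-frame clips
    print(model(video).shape)                         # torch.Size([2, 10, 1, 64, 64])

The only point the sketch makes is that the hidden state h is carried across frames, so a suitably trained decoder can still report an object's location during frames in which the input shows only the occluder; the actual project proposes selecting such models by how well they match human behavior and layer-resolved fMRI data.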

Are you the coordinator (or a participant) of this project? Please send me more information about the "OBJECTPERMOD" project.

For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And then please add a link to this page on your project's website.

The information about "OBJECTPERMOD" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.3.2.)

MITafterVIT (2020)

Unravelling maintenance mechanisms of immune tolerance after termination of venom immunotherapy by means of clonal mast cell diseases

NeuroTick (2019)

The neuroscience of tickling: cerebellar mechanisms and sensory prediction

ProTeCT (2019)

Proteasome as a target to combat trichomoniasis
