
FISHNAV (SIGNED)

Following a path of breadcrumbs: How fish recognize landmarks during navigation


Project "FISHNAV" data sheet

The following table provides information about the project.

Coordinator
THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD 

Organization address
address: WELLINGTON SQUARE UNIVERSITY OFFICES
city: OXFORD
postcode: OX1 2JD
website: www.ox.ac.uk

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

Coordinator country: United Kingdom [UK]
Total cost: 195,454 €
EC max contribution: 195,454 € (100%)
Programme: 1. H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
Call code: H2020-MSCA-IF-2014
Funding scheme: MSCA-IF-EF-ST
Starting year: 2015
Duration (year-month-day): from 2015-08-31 to 2020-10-08

 Partnership

Take a look at the project's partnership.

#  participant                                                         country       role         EC contrib. [€]
1  THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD    UK (OXFORD)   coordinator  195,454.00


 Project objective

Reliable vision-based object recognition is of fundamental importance to a wide range of species; however, it can be difficult, as the appearance of an object can vary greatly as a result of changes in viewpoint. Recognition during motion presents a particular challenge because the appearance of an object changes continuously, an issue especially relevant for animals that use landmarks to navigate. One recognition mechanism is to learn a two-dimensional snapshot of an object from a set viewpoint. The object can later be recognized once its appearance matches the stored snapshot. Some animals reduce the number of required snapshots by employing ‘active vision’, where they follow identical routes between landmarks. For fish, the complexity of recognition is compounded by the fact that, unlike surface-bound animals, they can freely move vertically, which could potentially increase the number of approach views to an object. An alternative possibility is that fish have a ‘view invariant’ recognition system and can generalize learned representations of objects so that they can be recognized from different viewing angles. However, high-level visual functions, such as object recognition, are associated with complex mammalian brain structures and may be impossible for animals lacking similar neural circuitry. Despite these problems, we know that fish are capable of navigating efficiently using landmarks. The goal of this project is to investigate how fish recognize visual landmarks during navigation and to determine how they cope with self-orientation-related changes in the appearance of objects during motion. Using behavioural experiments, we will test whether fish have view invariant recognition and/or whether they employ active vision during navigation. The proposed project will further our knowledge of how fish perceive their visual environment, as well as inform us about how evolutionarily conserved the mechanisms of object recognition are.

Are you the coordinator (or a participant) of this project? Please send me more information about the "FISHNAV" project.

For instance: the website URL (it has not been provided by EU-opendata yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "FISHNAV" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.3.2.)

MultiSeaSpace (2019)

Developing a unified spatial modelling strategy that accounts for interactions between species at different marine trophic levels, and different types of survey data.


ActinSensor (2019)

Identification and characterization of a novel damage sensor for cytoskeletal proteins in Drosophila


LIGHTMATT-EXPLORER (2019)

Experimental determination of the paraxial-vectorial limit of light-matter interactions
