
AGILEFLIGHT SIGNED

Low-latency Perception and Action for Agile Vision-based Flight


Project "AGILEFLIGHT" data sheet

The following table provides information about the project.

Coordinator
UNIVERSITAT ZURICH 

Organization address
address: RAMISTRASSE 71
city: ZURICH
postcode: 8006
website: n.a.

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

 Coordinator Country Switzerland [CH]
 Total cost 2,000,000 €
 EC max contribution 2,000,000 € (100%)
 Programme 1. H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
 Code Call ERC-2019-COG
 Funding Scheme ERC-COG
 Starting year 2020
 Duration (year-month-day) from 2020-07-01   to  2025-06-30

 Partnership

Take a look at the project's partnership.

 #  participant          country        role         EC contrib. [€]
 1  UNIVERSITAT ZURICH   CH (ZURICH)    coordinator  2,000,000.00


 Project objective

Drones are disrupting industries such as agriculture, package delivery, inspection, and search and rescue. However, they are still either controlled by a human pilot or heavily reliant on GPS for navigating autonomously. The alternative to GPS is onboard sensors, such as cameras: from the raw data, a local 3D map of the environment is built, which is then used to plan a safe trajectory to the goal. While the underlying algorithms are well understood, we are still far from having autonomous drones that can navigate through complex environments as well as human pilots. State-of-the-art perception and control algorithms are mature but not robust: coping with unreliable state estimation, low-latency perception, real-time planning in dynamic environments, and tight coupling of perception and action under severe resource constraints are all still unsolved research problems. Another issue is that, because battery energy density is increasing at a very slow rate, drones need to navigate faster in order to accomplish more within their limited flight time. To obtain more agile robots, we need faster sensors and low-latency processing.
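The map-then-plan pipeline described above (build a local map from sensor data, then plan a collision-free path to the goal) can be illustrated with a minimal toy sketch: a 2D occupancy grid searched with breadth-first search. This is only an illustrative assumption of the general idea, not the project's method; real systems use 3D maps and far more sophisticated planners.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid[r][c] == 1 marks an obstacle cell; returns the shortest list
    of free cells from start to goal, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # parent pointers for path reconstruction
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent pointers back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0
                    and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

# A small map with a wall across row 1: the planner routes around it.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
path = plan_path(grid, (0, 0), (2, 0))
```

On a drone, the grid would be a local 3D map refreshed from camera data every few milliseconds, which is why the abstract stresses low-latency perception: a stale map makes even an optimal planner unsafe.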

The goal of this project is to develop novel scientific methods that would allow me to demonstrate autonomous, vision-based, agile quadrotor navigation in unknown, GPS-denied, and cluttered environments with possibly moving obstacles, matching the maneuverability and agility of professional drone pilots. The outcome would be beneficial not only for disaster response but also for other scenarios, such as aerial delivery or inspection. To achieve this ambitious goal, I will first develop robust, low-latency, multimodal perception algorithms that combine the advantages of standard cameras with event cameras. Then, I will develop novel methods that unify perception and state estimation together with planning and control to enable agile maneuvers through cluttered, unknown, and dynamic environments.

Are you the coordinator (or a participant) of this project? Please send me more information about the "AGILEFLIGHT" project.

For instance: the website URL (it has not been provided by EU-opendata yet), the logo, a more detailed description of the project (in plain text, as an RTF file or a Word file), some pictures (as picture files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "AGILEFLIGHT" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.)

E-DIRECT (2020)

Evolution of Direct Reciprocity in Complex Environments


DOUBLE-TROUBLE (2020)

Replaying the ‘genome duplication’ tape of life: the importance of polyploidy for adaptation in a changing environment


Growth regulation (2019)

The wide-spread bacterial toxin delivery systems and their role in multicellularity
