Opendata, web and dolomites


A Theory for Understanding, Designing, and Training Deep Learning Systems


Project "THUNDEEP" data sheet

The following table provides information about the project.


Organization address
address: HERZL STREET 234
postcode: 7610001

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

 Coordinator Country Israel [IL]
 Total cost 1˙442˙360 €
 EC max contribution 1˙442˙360 € (100%)
 Programme 1. H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
 Code Call ERC-2017-STG
 Funding Scheme ERC-STG
 Starting year 2018
 Duration (year-month-day) from 2018-09-01   to  2023-08-31


Take a look at the project's partnership.

 #  participant                     country        role         EC contrib. [€]
 1  WEIZMANN INSTITUTE OF SCIENCE  IL (REHOVOT)   coordinator  1˙442˙360.00


 Project objective

The rise of deep learning, in the form of artificial neural networks, has been the most dramatic and important development in machine learning over the past decade. Much more than a merely academic topic, deep learning is currently being widely adopted in industry, placed inside commercial products, and is expected to play a key role in anticipated technological leaps such as autonomous driving and general-purpose artificial intelligence. However, our scientific understanding of deep learning is woefully incomplete. Most methods to design and train these systems are based on rules-of-thumb and heuristics, and there is a drastic theory-practice gap in our understanding of why these systems work in practice. We believe this poses a significant risk to the long-term health of the field, as well as an obstacle to widening the applicability of deep learning beyond what has been achieved with current methods.

Our goal is to tackle head-on this important problem, and develop principled tools for understanding, designing, and training deep learning systems, based on rigorous theoretical results.

Our approach is to focus on three inter-related sources of performance losses in neural network learning: their optimization error (that is, how to train a given network in a computationally efficient manner); their estimation error (how to ensure that training a network on a finite training set will ensure good performance on future examples); and their approximation error (how architectural choices of the networks affect the type of functions they can compute). For each of these problems, we show how recent advances allow us to effectively approach them, and describe concrete preliminary results and ideas, which will serve as starting points and indicate the feasibility of this challenging project.
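The three error sources named above correspond to the standard excess-risk decomposition from statistical learning theory. The notation below is the textbook convention (not taken from the project text): $R$ denotes the population risk, $R^*$ the Bayes-optimal risk, $\mathcal{H}$ the class of functions the network can compute, $h_{\mathcal{H}}$ the best predictor in $\mathcal{H}$, $h_S$ the empirical risk minimizer on the training set $S$, and $\hat{h}$ the predictor actually returned by the training procedure:

```latex
% Excess risk of the trained predictor \hat{h}, split into the
% three error sources discussed in the project objective:
\begin{aligned}
R(\hat{h}) - R^*
  &= \underbrace{R(\hat{h}) - R(h_S)}_{\text{optimization error}}
   + \underbrace{R(h_S) - R(h_{\mathcal{H}})}_{\text{estimation error}}
   + \underbrace{R(h_{\mathcal{H}}) - R^*}_{\text{approximation error}}.
\end{aligned}
```

Each term matches one of the questions in the paragraph above: the optimization error measures how well training minimizes the empirical objective, the estimation error measures generalization from the finite training set, and the approximation error measures what the chosen architecture can express at all.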

Are you the coordinator (or a participant) of this project? Please send me more information about the "THUNDEEP" project.

For instance: the website URL (it has not been provided by EU-opendata yet), the logo, a more detailed description of the project (in plain text, as an RTF file or a Word file), some pictures (as picture files, not embedded in any Word file), the Twitter account, the LinkedIn page, etc.

Send me an email and I will put them on your project's page as soon as possible.

Thanks. And then please add a link to this page on your project's website.

The information about "THUNDEEP" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.)

MOCHA (2019)

Understanding and leveraging ‘moments of change’ for pro-environmental behaviour shifts

Read More  

HelixMold (2019)

Computational design of novel functions in helical proteins by deviating from ideal geometries

Read More  


Dissecting hippocampal circuits for the encoding of early-life memories

Read More