Opendata, web and dolomites


An open or closed process: Determining the global scheme of perception


Project "HOWPER" data sheet

The following table provides information about the project.


Organization address
Address: HERZL STREET 234
Postcode: 7610001

Contact info
Title: n.a.
Name: n.a.
Surname: n.a.
Function: n.a.
Email: n.a.
Telephone: n.a.
Fax: n.a.

 Coordinator country: Israel [IL]
 Total cost: €2,493,441
 EC max contribution: €2,493,441 (100%)
 Programme: 1. H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
 Call code: ERC-2017-ADG
 Funding scheme: ERC-ADG
 Starting year: 2018
 Duration (year-month-day): from 2018-06-01 to 2023-05-31


Take a look at the project's partnership.

#  Participant  Country  Role  EC contrib. [€]
1  WEIZMANN INSTITUTE OF SCIENCE  IL (REHOVOT)  coordinator  2,493,441.00


 Project objective

Despite decades of intensive research, there is no agreement about the general scheme of perception: Is the external object a trigger for a brain-internal process (open-loop perception, OLP) or is the object included in brain dynamics during the entire perceptual process (closed-loop perception, CLP)? HOWPER is designed to provide a definite answer to this question in the cases of human touch and vision. What enables this critical test is our development of an explicit CLP hypothesis, which will be contrasted, via specific testable predictions, with the OLP scheme.

In the event that CLP is validated, HOWPER will introduce a radical paradigm shift in the study of perception, since almost all current experiments are guided, implicitly or explicitly, by the OLP scheme. If OLP is confirmed, HOWPER will provide the first formal affirmation of its superiority over CLP.

Our approach in this novel paradigm is based on a triangle of interactive efforts comprising theory, analytical experiments, and synthetic experiments. The theoretical effort (WP1) will be based on the core theoretical framework already developed in our lab. The analytical experiments (WP2) will involve human perceivers. The synthetic experiments (WP3) will be performed on synthesized artificial perceivers. The fourth WP will exploit our novel rat-machine hybrid model for testing the neural applicability of the insights gained in the other WPs, whereas the fifth WP will translate our insights into novel visual-to-tactile sensory substitution algorithms.

HOWPER is expected to either revolutionize or significantly advance the field of human perception, to greatly improve visual-to-tactile sensory substitution approaches, and to contribute novel biomimetic algorithms for autonomous robotic agents.


List of publications (year, authors, title, journal, last update):

2018: Michele Rucci, Ehud Ahissar, David Burr, "Temporal Coding of Visual Space", Trends in Cognitive Sciences 22/10, pages 883-895, ISSN: 1364-6613, DOI: 10.1016/j.tics.2018.07.009 (last update 2020-03-05)

2019: David Deutsch, Elad Schneidman, Ehud Ahissar, "Generalization of Object Localization From Whiskers to Other Body Parts in Freely Moving Rats", Frontiers in Integrative Neuroscience 13, ISSN: 1662-5145, DOI: 10.3389/fnint.2019.00064 (last update 2020-03-05)

Are you the coordinator (or a participant) of this project? Please send me more information about the "HOWPER" project.

For instance: the website URL (not yet provided by EU-opendata), the logo, a more detailed description of the project (in plain text, as an RTF file or a Word file), some pictures (as picture files, not embedded in any Word file), the Twitter account, the LinkedIn page, etc.

Send me an email and I will put them on your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "HOWPER" are provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.)

FICOMOL (2019)

Field Control of Cold Molecular Collisions

Read More


The Mass Politics of Disintegration

Read More  

Photopharm (2020)

Photopharmacology: From Academia toward the Clinic.

Read More