The following table provides information about the project.
|Coordinator||EYEFREE ASSISTING COMMUNICATION LTD|
|Coordinator country||Israel [IL]|
|Total cost||2,843,250 €|
|EC max contribution||1,990,275 € (70%)|
Programme(s):
1. H2020-EU.2.1.1. (INDUSTRIAL LEADERSHIP - Leadership in enabling and industrial technologies - Information and Communication Technologies (ICT))
2. H2020-EU.2.3.1. (Mainstreaming SME support, especially through a dedicated instrument)
|Duration (year-month-day)||from 2018-03-01 to 2020-02-29|
Take a look at the project's partnership.
|#||Participant||Country (city)||Role||EC contribution (€)|
|1||EYEFREE ASSISTING COMMUNICATION LTD||IL (Tel Aviv-Jaffa)||coordinator||1,990,275.00|
Today, there are more than 1.2 million ALS and other “locked-in” patients who cannot move or speak (due to a stroke, neurological disease, spinal injury, etc.). These patients have fully functioning cognitive abilities, but they are locked in their bodies. EyeFree Assisting Communication Ltd. (EyeFree) plans to disrupt the market for ALS and locked-in assistance devices with EyeControl, a wearable, simple-to-use communication device that offers the main required communication functionality for much less than the currently used devices (supplied mainly by Tobii). Today's leading communication devices are screen-based and controlled by gazing at a screen; they therefore require long training and limit the user to speaking only when positioned in front of the screen with the system calibrated to the screen's position.

EyeControl was developed by founders related to ALS patients in order to change the reality of “locked-in” patients by enabling any patient to communicate anytime, anywhere. EyeControl follows the user's eye movements using an infrared camera; the camera translates the movements of the pupil into a “joystick” and transmits them to a small computer that converts the movements into actions. EyeControl does not use a screen; instead it relies on audio feedback, so the user hears what he wants to say rather than seeing it on a screen. A bone-conduction earphone lets the user hear what he wants to say before transmitting it.

The device enables three levels of communication, ranging from a basic call for help to participation in a full conversation. It can also connect to a smartphone or tablet via Bluetooth. Additionally, it lets the user hear his own language in the earphone while transmitting through the speaker in another language, which allows the user to communicate with a caregiver who may speak a different language.
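The “joystick” idea above — pupil displacement from a calibrated center mapped to discrete directions that drive actions — can be illustrated with a minimal sketch. Everything here (function name, dead-zone threshold, direction labels) is an assumption for illustration, not EyeControl's actual implementation.

```python
# Hypothetical sketch: map a normalized pupil displacement to a discrete
# "joystick" direction. All names and thresholds are assumed, not taken
# from the actual EyeControl device.

DEAD_ZONE = 0.15  # assumed: displacements below this count as "no movement"

def pupil_to_joystick(dx: float, dy: float) -> str:
    """Classify a pupil displacement (dx, dy) from the calibrated center.

    dx, dy are normalized to [-1, 1]; positive dx is right, positive dy is up.
    Returns one of: "center", "left", "right", "up", "down".
    """
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return "center"  # no deliberate eye movement
    # Pick the dominant axis, then its sign.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

# Example: a clear glance to the left
print(pupil_to_joystick(-0.6, 0.1))  # left
```

A real eye tracker would add per-user calibration, smoothing across frames, and dwell-time confirmation before an action fires; this sketch only shows the direction-classification step.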
Work performed, outcomes and results: progress report(s)
Are you the coordinator (or a participant) of this project? Please send me more information about the "EYECONTROL" project.
For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.
Send me an email (firstname.lastname@example.org) and I will put them on your project's page as soon as possible.
Thanks. And then please add a link to this page on your project's website.
The information about "EYECONTROL" is provided by the European Open Data Portal: CORDIS open data.