Explore the word cloud of the AMPLIFY project. It gives you a very rough idea of what the "AMPLIFY" project is about.
The following table provides information about the project.
| Field | Value |
|---|---|
| Coordinator Country | Germany [DE] |
| Total cost | 1 925 250 € |
| EC max contribution | 1 925 250 € (100%) |
| Programme | H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC)) |
| Duration (year-month-day) | from 2016-07-01 to 2021-06-30 |
Take a look at the project's partnership.
| # | Participant | Country (City) | Role | Contribution (€) |
|---|---|---|---|---|
| 1 | LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN | DE (MUENCHEN) | coordinator | 1 267 750.00 |
| 2 | UNIVERSITAET STUTTGART | DE (STUTTGART) | participant | 657 500.00 |
Current technical sensor systems offer capabilities that are superior to human perception. Cameras can capture a spectrum that is wider than visible light, high-speed cameras can show movements that are invisible to the human eye, and directional microphones can pick up sounds at long distances.

The vision of this project is to lay a foundation for the creation of digital technologies that provide novel sensory experiences and new perceptual capabilities for humans that are natural and intuitive to use. In a first step, the project will assess the feasibility of creating artificial human senses that provide new perceptual channels to the human mind, without increasing the experienced cognitive load. A particular focus is on creating intuitive and natural control mechanisms for amplified senses using eye gaze, muscle activity, and brain signals. Through the creation of a prototype that provides mildly unpleasant stimulations in response to perceived information, the feasibility of implementing an artificial reflex will be experimentally explored. The project will quantify the effectiveness of new senses and artificial perceptual aids compared to the baseline of unaugmented perception.

The overall objective is to systematically research, explore, and model new means for increasing the human intake of information in order to lay the foundation for new and improved human senses enabled through digital technologies and to enable artificial reflexes. The ground-breaking contributions of this project are:

1. to demonstrate the feasibility of reliably implementing amplified senses and new perceptual capabilities,
2. to prove the possibility of creating an artificial reflex,
3. to provide an example implementation of amplified cognition that is empirically validated, and
4. to develop models, concepts, components, and platforms that will enable and ease the creation of interactive systems that measurably increase human perceptual capabilities.
| Year | Authors and Title | Journal | Pages | ISSN | DOI | Last update |
|---|---|---|---|---|---|---|
| | Pascal Knierim, Thomas Kosch, Matthias Hoppe, Albrecht Schmidt: "Challenges and Opportunities of Mixed Reality Systems in Education" | Mensch und Computer 2018 - Workshopband | 325-330 | | 10.18420/muc2018-ws07-0471 | 2019-08-29 |
| | Albrecht Schmidt, Thomas Herrmann: "Intervention user interfaces" | | 40-45 | 1072-5520 | 10.1145/3121357 | |
| | Stefan Schneegass, Albrecht Schmidt, Max Pfeiffer: "Creating user interfaces with electrical muscle stimulation" | | 74-77 | 1072-5520 | 10.1145/3019606 | |
| | Thomas Kosch, Markus Funk, Albrecht Schmidt, Lewis L. Chuang: "Identifying Cognitive Assistance with Mobile Electroencephalography" | Proceedings of the ACM on Human-Computer Interaction 2/EICS | 1-20 | 2573-0142 | 10.1145/3229093 | 2019-04-18 |
Are you the coordinator (or a participant) of this project? Please send me more information about the "AMPLIFY" project.
For instance: the website URL (it has not been provided by the EU open-data portal yet), the logo, a more detailed description of the project (in plain text, as an RTF file, or as a Word file), some pictures (as picture files, not embedded in a Word file), a Twitter account, a LinkedIn page, etc.
Send me an email (firstname.lastname@example.org) and I will put them on your project's page as soon as possible.
Thanks. And then please add a link to this page on your project's website.
The information about "AMPLIFY" is provided by the European Open Data Portal: CORDIS open data.