
Levitate (SIGNED)

Levitation with localised tactile and audio feedback for mid-air interactions


Project "Levitate" data sheet

The following table provides information about the project.

Coordinator
UNIVERSITY OF GLASGOW 

Organization address
address: UNIVERSITY AVENUE
city: GLASGOW
postcode: G12 8QQ
website: www.gla.ac.uk

contact info: n.a.

 Coordinator country United Kingdom [UK]
 Project website http://www.levitateproject.org
 Total cost 2,999,870 €
 EC max contribution 2,999,870 € (100%)
 Programme H2020-EU.1.2.1. (FET Open)
 Call code H2020-FETOPEN-1-2016-2017
 Funding scheme RIA
 Starting year 2017
 Duration from 2017-01-01 to 2020-12-31

 Partnership

Take a look at the project's partnership.

#  participant  country (city)  role  EC contribution [€]
1  UNIVERSITY OF GLASGOW  UK (GLASGOW)  coordinator  705,685.00
2  THE UNIVERSITY OF SUSSEX  UK (BRIGHTON)  participant  669,365.00
3  CHALMERS TEKNISKA HOEGSKOLA AB  SE (GOETEBORG)  participant  529,750.00
4  UNIVERSITAT BAYREUTH  DE (BAYREUTH)  participant  524,036.00
5  ULTRAHAPTICS LIMITED  UK (BRISTOL)  participant  487,820.00
6  AARHUS UNIVERSITET  DK (AARHUS C)  participant  83,213.00


 Project objective

'This project will be the first to create, prototype and evaluate a radically new human-computer interaction paradigm that empowers the unadorned user to reach into levitating matter, see it, feel it, manipulate it and hear it. Our users can interact with the system in a walk-up-and-use manner without any user instrumentation.

As we are moving away from keyboards and mice to touch and touchless interactions, ironically, the main limit is the lack of any physicality and co-located feedback. In this project, we propose a highly novel vision of bringing the physical interface to the user in mid-air. In our vision, the computer can control the existence, form, and appearance of complex levitating objects composed of 'levitating atoms'. Users can reach into the levitating matter, feel it, manipulate it, and hear how they deform it with all feedback originating from the levitating object's position in mid-air, as it would with objects in real life. This will completely change how people use technology as it will be the first time that they can interact with technology in the same way they would with real objects in their natural environment.

We will draw on our understanding of acoustics to implement all of the components in a radically new approach. In particular, we will draw on ultrasound beam-forming and manipulation techniques to create acoustic forces that can levitate particles and to provide directional audio cues. By using a phased array of ultrasound transducers, the team will create levitating objects that can be individually controlled and at the same time create tactile feedback when the user manipulates these levitating objects. We will then demonstrate that the levitating atoms can each become sound sources through the use of parametric audio with our ultrasound array serving as the carrier of the audible sound. We will visually project onto the objects to create a rich multimodal display floating in space.'
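To make the beam-forming idea in the objective concrete, the short Python sketch below shows the basic phase-delay calculation behind focusing a phased array of ultrasound transducers on a point in mid-air, which is the building block for both acoustic traps and focused tactile feedback. It is an illustrative sketch only, not code from the project: the 16x16 array geometry, 10.5 mm transducer pitch, 40 kHz carrier frequency and simple monopole source model are assumptions chosen for the example.

import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
FREQUENCY = 40e3         # Hz; a common choice for airborne ultrasound transducers
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
WAVENUMBER = 2.0 * np.pi / WAVELENGTH


def transducer_grid(n=16, pitch=0.0105):
    """Positions (n*n, 3) of a flat n x n transducer array in the z = 0 plane."""
    coords = (np.arange(n) - (n - 1) / 2.0) * pitch
    xx, yy = np.meshgrid(coords, coords)
    return np.column_stack([xx.ravel(), yy.ravel(), np.zeros(n * n)])


def focus_phases(transducers, focal_point):
    """Drive phase (radians) per transducer so that all waves arrive in phase
    at focal_point: phi_i = -k * |r_i - r_focus| (modulo 2*pi)."""
    distances = np.linalg.norm(transducers - focal_point, axis=1)
    return (-WAVENUMBER * distances) % (2.0 * np.pi)


def field_at(points, transducers, phases, amplitude=1.0):
    """Complex acoustic pressure at the given points, modelling each
    transducer as a simple monopole source (directivity ignored)."""
    d = np.linalg.norm(points[:, None, :] - transducers[None, :, :], axis=2)
    return (amplitude / d * np.exp(1j * (WAVENUMBER * d + phases))).sum(axis=1)


if __name__ == "__main__":
    array = transducer_grid()
    focus = np.array([0.0, 0.0, 0.10])   # focal point 10 cm above the array
    phases = focus_phases(array, focus)

    # Compare the field magnitude at the focus with an off-focus probe point:
    # the focusing phases should concentrate acoustic pressure at the focus.
    probes = np.array([focus, [0.03, 0.0, 0.10]])
    magnitudes = np.abs(field_at(probes, array, phases))
    print(f"|p| at focus:  {magnitudes[0]:.1f} (arbitrary units)")
    print(f"|p| off focus: {magnitudes[1]:.1f} (arbitrary units)")

In the acoustic-levitation literature, trap-type phase signatures (for example, adding pi to the phases of one half of the array to form a twin trap) are layered on top of a focusing solution of this kind, and the same delay-and-sum principle underlies directional and parametric audio delivery.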

 Deliverables

List of deliverables.
Database of broadband radiation characteristics across multiple transducer types Documents, reports 2020-02-27 11:38:14
"Organization of demonstrators and workshops as part of outreach activities #3" Demonstrators, pilots, prototypes 2020-02-27 11:38:14
Catalogue of effective haptic experiences, together with guidelines on how to create them Documents, reports 2020-02-27 11:38:14
"Organization of demonstrators and workshops as part of outreach activities #2" Demonstrators, pilots, prototypes 2020-02-27 11:38:14
A working prototype of the transducer array with haptics + audio + visual Demonstrators, pilots, prototypes 2020-02-27 11:38:14
Data management plan Open Research Data Pilot 2020-02-27 11:38:14
Website and logo, weekly social media tweets/updates Websites, patent filings, videos etc. 2020-02-27 11:38:14
"Organization of demonstrators and workshops as part of outreach activities #1" Demonstrators, pilots, prototypes 2020-02-27 11:38:14
Python toolbox of sound field/beamformer design methods Other 2020-02-27 11:38:14
Demonstrator v1 illustrating interaction techniques for atomic selection action Demonstrators, pilots, prototypes 2020-02-27 11:38:14

Take a look at the deliverables in detail: detailed list of Levitate deliverables.

 Publications

List of publications.
year  authors and title  journal  last update
2020 Sofia Seinfeld, Tiare Feuchtner, Antonella Maselli, Jörg Müller
User Representations in Human-Computer Interaction
published pages: 1-39, ISSN: 0737-0024, DOI: 10.1080/07370024.2020.1724790
Human–Computer Interaction 2020 2020-03-24
2019 Ryuji Hirayama, Diego Martinez Plasencia, Nobuyuki Masuda, Sriram Subramanian
A volumetric display for visual, tactile and audio presentation using acoustic trapping
published pages: 320-323, ISSN: 0028-0836, DOI: 10.1038/s41586-019-1739-5
Nature 575/7782 2020-03-24
2018 Michele Iodice, William Frier, James Wilcox, Ben Long, Orestis Georgiou
Pulsed schlieren imaging of ultrasonic haptics and levitation using phased arrays
published pages: , ISSN: , DOI:
Proceedings of the ICSV 2018 2018 2020-02-27

Are you the coordinator (or a participant) of this project? Please send me more information about the "LEVITATE" project.

For instance: the website URL (it has not been provided by the EU open-data portal yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will add them to your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "LEVITATE" is provided by the European Open Data Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.2.1.)

FRINGE (2019)

Fluorescence and Reactive oxygen Intermediates by Neutron Generated electronic Excitation as a foundation for radically new cancer therapies


SOLQC (2019)

Synthetic Oligonucleotide Quality Control Software


ATEMPGRAD (2019)

Analysing Temperature Effects with a Mobile and Precise Gradient Device
