
H-Reality SIGNED

Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities


 H-Reality project word cloud

Explore the word cloud of the H-Reality project. It gives you a rough idea of what the project "H-Reality" is about (a minimal sketch of how such a cloud can be derived follows the word list).

gestures    touch    rings    apps    landscape    speech    strolling    unintuitive    experiences    revolutionary    vision    last    ar    rotational    textures    haptics    materials    feedback    integrating    manipulation    imbue    implications    media    sense    ultrasonic    first    stimulation    space    realities    sight    made    tribological    surface    contrast    physical    skin    false    skills    wearable    reality    intuitive    dynamics    operated    sensory    truth    files    ultimately    directional    reaching    untethered    psychophysical    delivering    margaret    vr    rendering    interactions    objects    experts    thin    pioneers    content    paramount    icons    hollow    distinguished    object    ultrasound    modulated    language    generation    virtually    air    feel    transform    swipe    mathematical    online    3d    computer    significantly    shapes    informing    renderings    computational    atwood    data    actuators    auditory    virtual    hone    tells    commercial    augmented    interfaces    sound    sensation    ready    visual    desktop    home    contact    always    vibrotactile    digital    human    dangerous    realm    mechanics    dimension    screen    haptic    machinery    safety    surgeons    manifest    graphical    instinctive    ambition   
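A word cloud like the one above is usually nothing more than a frequency count of the words in the project description, with common stop words removed. The Python sketch below illustrates that idea only; it is not the code used by this page, and the input file name and the stop-word list are assumptions.

```python
# Minimal sketch: derive word frequencies for a word cloud from a text file.
# The file name and stop-word list are illustrative assumptions.
import re
from collections import Counter

STOP_WORDS = {"the", "and", "of", "a", "an", "to", "in", "will", "be", "is",
              "for", "with", "where", "via", "their", "so", "that", "our", "its"}

def word_frequencies(text: str, top: int = 20) -> list[tuple[str, int]]:
    """Lowercase the text, keep alphabetic tokens, drop stop words and
    very short words, and return the most common words with their counts."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(top)

if __name__ == "__main__":
    # Hypothetical input: the project objective text saved to a local file.
    objective = open("h_reality_objective.txt").read()
    for word, n in word_frequencies(objective):
        print(f"{word}: {n}")
```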

Project "H-Reality" data sheet

The following table provides information about the project.

Coordinator
THE UNIVERSITY OF BIRMINGHAM 

Organization address
address: Edgbaston
city: BIRMINGHAM
postcode: B15 2TT
website: www.bham.ac.uk

contact info: not available (all fields n.a.)

 Coordinator Country United Kingdom [UK]
 Project website https://www.hreality.eu/
 Total cost 2˙994˙965 €
 EC max contribution 2˙994˙965 € (100%)
 Programme 1. H2020-EU.1.2.1. (FET Open)
 Call code H2020-FETOPEN-1-2016-2017
 Funding Scheme RIA
 Starting year 2018
 Duration (year-month-day) from 2018-10-01   to  2021-09-30

 Partnership

Take a look at the project's partnership.

#  participant  country (city)  role  EC contrib. [€]
1    THE UNIVERSITY OF BIRMINGHAM UK (BIRMINGHAM) coordinator 806˙957.00
2    CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS FR (PARIS) participant 598˙420.00
3    ACTRONIKA FR (PARIS) participant 597˙742.00
4    ULTRAHAPTICS LIMITED UK (BRISTOL) participant 593˙350.00
5    TECHNISCHE UNIVERSITEIT DELFT NL (DELFT) participant 398˙495.00


 Project objective

“Touch comes before sight, before speech. It is the first language and the last, and it always tells the truth” (Margaret Atwood), yet digital content today remains focused on visual and auditory stimulation. Even in the realm of VR and AR, sight and sound remain paramount. In contrast, methods for delivering haptic (sense of touch) feedback in commercial media are significantly less advanced than graphical and auditory feedback. Yet without a sense of touch, experiences ultimately feel hollow, virtual realities feel false, and Human-Computer Interfaces become unintuitive.

Our vision is to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality. The ambition of H-Reality will be achieved by integrating the commercial pioneers of ultrasonic “non-contact” haptics, state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and mechanics of touch, and experts in the psychophysical rendering of sensation. The result will be a sensory experience where digital 3D shapes and textures are made manifest in real space via modulated, focused ultrasound, ready for the untethered hand to feel, where next-generation wearable haptic rings provide directional vibrotactile stimulation, informing users of an object's dynamics, and where computational renderings of specific materials can be distinguished via their surface properties.

The implications of this technology will be far-reaching. The computer touch-screen will be brought into the third dimension so that swipe gestures will be augmented with instinctive rotational gestures, allowing intuitive manipulation of 3D data sets and strolling about the desktop as a virtual landscape of icons, apps and files. H-Reality will transform online interactions; dangerous machinery will be operated virtually from the safety of the home, and surgeons will hone their skills on thin air.
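The "modulated, focused ultrasound" mentioned in the objective refers to phased-array focusing: each transducer's emission phase is chosen so that its wave arrives at a target point in phase with the others, creating a localized pressure point the bare hand can feel. The Python sketch below illustrates only this basic focusing law; it is not H-Reality project code, and the array size, transducer pitch, and 40 kHz operating frequency are assumptions typical of airborne ultrasonic haptics.

```python
# Minimal sketch (not project code): per-transducer phases that focus a
# 40 kHz ultrasonic phased array at a point in mid-air. Array geometry and
# parameter values are illustrative assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s, air at roughly 20 °C
FREQUENCY = 40_000.0     # Hz, common for airborne ultrasonic haptics
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def transducer_grid(n=16, pitch=0.0105):
    """Positions (metres) of an n x n planar array lying in the z = 0 plane."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xx, yy = np.meshgrid(coords, coords)
    return np.column_stack([xx.ravel(), yy.ravel(), np.zeros(n * n)])

def focus_phases(positions, focal_point):
    """Emission phase (radians) for each transducer so that all waves arrive
    at focal_point in phase: phase_i = 2*pi*f * (distance_i / c), wrapped."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    time_of_flight = distances / SPEED_OF_SOUND
    return (2 * np.pi * FREQUENCY * time_of_flight) % (2 * np.pi)

if __name__ == "__main__":
    array = transducer_grid()
    # Hypothetical focal point 20 cm above the array centre.
    phases = focus_phases(array, focal_point=np.array([0.0, 0.0, 0.20]))
    print(f"{len(phases)} transducers, wavelength {WAVELENGTH * 1000:.1f} mm")
    print("first five phases (rad):", np.round(phases[:5], 3))
```

In practice the focal point is also amplitude-modulated at a low frequency (on the order of a few hundred hertz) so that the skin's mechanoreceptors can detect it, which is what the objective's phrase "modulated" refers to.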

 Deliverables

List of deliverables.
deliverable | type | last update
Vibrotaction mechanism models | Documents, reports | 2020-02-18 11:08:25
"Non-contact Haptic prototype #1" | Demonstrators, pilots, prototypes | 2020-02-18 11:08:24
Demonstrations and Outreach activities | Websites, patent filings, videos etc. | 2020-02-18 11:08:25
"Non-contact Haptic prototype #2" | Demonstrators, pilots, prototypes | 2020-02-18 11:08:25
Data management plan | Documents, reports | 2020-02-18 11:08:24
Perceptual limits for materials and objects | Documents, reports | 2020-02-18 11:08:24
Website/social media and logo | Demonstrators, pilots, prototypes | 2020-02-18 11:08:24

Take a look at the deliverables list in detail: detailed list of H-Reality deliverables.

 Publications

List of publications.

year | authors and title | journal | last update
2020 | Thomas Howard, Maud Marchal, Anatole Lecuyer, Claudio Pacchierotti, "PUMAH: Pan-Tilt Ultrasound Mid-Air Haptics for Larger Interaction Workspace in Virtual Reality" | IEEE Transactions on Haptics 13/1, pages 38-44, ISSN: 1939-1412, DOI: 10.1109/TOH.2019.2963028 | 2020-04-01
2020 | Steeven Villa Salazar, Claudio Pacchierotti, Xavier de Tinguy, Anderson Maciel, Maud Marchal, "Altering the Stiffness, Friction, and Shape Perception of Tangible Objects in Virtual Reality Using Wearable Haptics" | IEEE Transactions on Haptics 13/1, pages 167-174, ISSN: 1939-1412, DOI: 10.1109/TOH.2020.2967389 | 2020-04-01

Are you the coordinator (or a participant) of this project? Please send me more information about the "H-REALITY" project.

For instance: the website URL (if it has not been provided by EU-opendata yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And then please add a link to this page on your project's website.

The information about "H-REALITY" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.2.1.)

RSENSE (2020)

Revolutionizing disease and environmental detection with portable optoacoustic sensing


COMMER-CELL (2019)

Commercialisation of neuronal cell co-cultures


SCAFFOLD-NEEDS (2019)

Commercialization of 3D scaffold platforms for neuronal cell culture models
