Periodic Reporting for period 2 - InteractiveSkin (InteractiveSkin: Digital Fabrication of Personalized On-Body User Interfaces)

Teaser

User interfaces are moving onto the human body. However, today’s rigid and mass-fabricated devices do not conform closely to the body, nor are they customized to fit individual users. This drastically restricts their interactive capabilities. This project aims to lay the...

Summary

User interfaces are moving onto the human body. However, today’s rigid and mass-fabricated devices do not conform closely to the body, nor are they customized to fit individual users. This drastically restricts their interactive capabilities.
This project aims to lay the foundations for a new generation of body-worn UIs: interactive skin. Our approach is unique in proposing computational design and rapid manufacturing of stretchable electronics as a means to customize on-body UIs. Our vision is that laypeople design highly personalized interactive skin devices in a software tool and then print them. Interactive skin has the advantage of being very thin, stretchable, of custom geometry, with embedded sensors and output components. This allows it to be used as highly conformal interactive patches on various body locations, for many mobility tasks, leveraging the many degrees of freedom of body interaction.
This vision requires ground-breaking contributions at the intersection of on-body interaction, stretchable electronics, and digital fabrication:
1) We will contribute an automatic method to generate printable electronic layouts for interactive skin from a high-level design specification.
2) We will contribute multimodal interaction primitives that address the unique challenges of skin interaction.
3) We will develop principles for design tools that allow end-users to easily design a personalized interactive skin device.
4) We will use the newly developed methodology to realize and empirically evaluate interactive skin in unsolved application cases.
The project will establish digital fabrication as a strong complement to the existing mass manufacturing of interactive devices. We will contribute to a deep and systematic understanding of the on-body interaction space and show how to build UIs with unprecedented body compatibility and interactive capabilities. We expect our method to act as a key enabler for the next generation of body-worn UIs.

Work performed

The first project period has established the foundations for interactive skin in four main areas, with the following key results:
A) Computational fabrication of interactive skin
We developed functional designs for sensors and output components that can be embedded in interactive skin devices. These components are ultra-thin (typically between 1 and 50 microns) and elastic, so a device can closely conform to the user’s skin and be worn ergonomically even on highly curved and stretchable body locations. As skin is inherently multi-modal and our goal is to support rich and expressive interfaces, our results contributed designs for a wide variety of components, including sensors for multi-touch, sliding, bending, squeezing, stretching, and pulling input. A highlight result is Tacttoo, the thinnest tactile matrix interface presented in the literature so far. It allows the user to feel real-world objects through the interface while augmenting them with computer-generated tactile output. Most of our designs can be readily printed on temporary tattoo paper using commercially available functional inks, providing a very direct and rapid way to fabricate custom devices. Our vision is to ultimately print a personalized electronic device much like we print a document today. We developed an award-winning method that allows people to use an ordinary desktop inkjet printer to print functional interactive skin devices within minutes. This will help make the technology accessible to a much wider audience of user interface experts, designers, and domain experts, who can now start to investigate interactive skin devices without needing highly specialized knowledge and extensive lab equipment.
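As a rough illustration of the kind of readout that a printed multi-touch matrix requires, the following Python sketch scans a row/column electrode grid and reports touched intersections. It is a minimal sketch under stated assumptions, not the project’s actual sensing electronics: the electrode counts, the `read_capacitance` stub, and the threshold are all hypothetical.

```python
# Illustrative sketch only; the project's real readout hardware is not
# described in this report. Assumes a hypothetical mutual-capacitance
# row/column matrix sampled by `read_capacitance(row, col)`.

ROWS, COLS = 4, 6          # electrode counts of the printed matrix (assumed)
BASELINE = 100.0           # untouched reading, arbitrary units (assumed)
TOUCH_THRESHOLD = 0.15     # relative drop that counts as a touch (assumed)

def read_capacitance(row: int, col: int) -> float:
    """Stub standing in for real sensing hardware; returns the baseline."""
    return BASELINE

def scan_touches():
    """Scan each row/column intersection and collect touched cells.

    Mutual-capacitance matrices are typically scanned by driving one row
    at a time and measuring every column; a finger near an intersection
    reduces the coupled capacitance there.
    """
    touches = []
    for r in range(ROWS):
        for c in range(COLS):
            value = read_capacitance(r, c)
            if (BASELINE - value) / BASELINE > TOUCH_THRESHOLD:
                touches.append((r, c))
    return touches

print(scan_touches())  # [] with the stub; touched cells with real hardware
```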
B) Interacting with interactive skin
Skin is a fundamentally different interaction surface than conventional touchscreens. A central aim of this project is to investigate how we can leverage the unique properties of skin for user interaction. First, we proposed and investigated the concept of body landmarks: haptic or visual features of the human body, such as knuckles or wrinkles, that can be used to guide interaction on the skin. A set of interaction techniques and functional prototypes demonstrated the power of this principle. Second, we investigated how the skin on the user’s fingers can offer a surface for microgestures performed between fingers. Such gestures offer a rapid, versatile, and discreet means of input in mobile contexts. We conceptually explored the design space of these gestures, leading to an award-winning publication, and demonstrated new gestures for settings where the hands are busy holding an object. Moreover, we conducted in-depth investigations of deformation-based input and output; their results can lead to new and better ways of interacting on the skin than through touch contact alone.
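To make the body-landmark idea concrete, here is a minimal Python sketch of landmark-guided input: a touch point is snapped to the nearest landmark if one is close enough. The coordinate system, landmark names, positions, and snapping radius are hypothetical assumptions for illustration, not the project’s published technique.

```python
# Illustrative sketch only. Assumes touch positions and body landmarks share
# a 2D coordinate system on the forearm; all names and values are made up.

import math

# Hypothetical landmark positions (cm along / across the forearm).
LANDMARKS = {
    "wrist_crease": (0.0, 0.0),
    "mole": (7.5, 1.2),
    "elbow_crease": (24.0, 0.0),
}

def snap_to_landmark(touch, max_dist=2.0):
    """Return the landmark nearest to a touch point, or None if all are
    farther than max_dist. Landmarks can anchor on-skin widgets because
    users can locate them by sight or by feel, even without looking."""
    best_name, best_dist = None, max_dist
    for name, (x, y) in LANDMARKS.items():
        dist = math.hypot(touch[0] - x, touch[1] - y)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

print(snap_to_landmark((7.0, 1.0)))   # -> 'mole'
print(snap_to_landmark((15.0, 3.0)))  # -> None (no landmark nearby)
```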
C) Design tools for personalized interactive skin
Our goal is to offer computational support that helps a wide audience design personalized and customized interactive skin devices. This includes personalization to a specific user’s body proportions as well as customization to specific application contexts, body locations, or aesthetic goals. The prior state of the art was to create interactive skin devices manually, which required expertise in design, electronics, materials, and other fields. We aim to establish the foundations for tools that algorithmically generate functional devices from a high-level design specification. Working toward this overall goal, we made several pioneering contributions during the first project period. These include principles and the first design tool for generating a custom-shaped multi-touch sensor from only a high-level definition of the sensor’s size and shape. Furthermore, we developed techniques that let design tools generate devices with customized tactile output. Finally, we have worked on algorithmic approaches that allow design tools to customize the material...
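The following Python sketch shows, in toy form, what generating a sensor layout from only a size-and-shape definition can look like: electrode positions are placed on a regular grid and clipped to a user-drawn outline. It is a minimal sketch under stated assumptions, not the project’s actual design tool; the outline, pitch, and electrode model are hypothetical.

```python
# Toy sketch of shape-driven layout generation, not the project's tool.
# Given a high-level outline (polygon) and an electrode pitch, it keeps
# grid-placed electrode centers that fall inside the outline.

def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) pairs."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def generate_electrodes(outline, pitch):
    """Place electrode centers on a regular grid clipped to the outline."""
    xs = [p[0] for p in outline]
    ys = [p[1] for p in outline]
    electrodes = []
    y = min(ys)
    while y <= max(ys):
        x = min(xs)
        while x <= max(xs):
            if point_in_polygon(x, y, outline):
                electrodes.append((x, y))
            x += pitch
        y += pitch
    return electrodes

# Example: a rough arrow-shaped sensor outline (cm) at 0.5 cm pitch.
outline = [(0, 0), (4, 0), (4, -1), (6, 1), (4, 3), (4, 2), (0, 2)]
print(len(generate_electrodes(outline, 0.5)), "electrodes placed")
```

A real tool would, of course, also route connection traces and account for stretch and body curvature; the point here is only the shape-to-layout step.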

Final results

See final scientific report

Website & more info

More info: https://hci.cs.uni-saarland.de/research/.