Periodic Reporting for period 2 - socSMCs (Socialising Sensori-Motor Contingencies)

Teaser

In order to safely and meaningfully cooperate with humans, robots must be able to interact in ways that humans find intuitive and understandable. Therefore, robots must develop an advanced real-world social intelligence and be transparent with respect to the causes and reasons...

Summary

In order to safely and meaningfully cooperate with humans, robots must be able to interact in ways that humans find intuitive and understandable. Robots must therefore develop an advanced real-world social intelligence and be transparent to human users with respect to the causes and reasons for their actions. The project addresses this challenge and proposes a novel approach for understanding and modeling social behavior and for implementing social coupling in robots. Our approach is based on the view that even complex modes of social interaction are grounded in basic sensorimotor patterns that enable the dynamic coupling of agents. Sensorimotor patterns (SensoriMotor Contingencies, SMCs) are known to be highly relevant in cognition. We use ‘socSMCs’ as a shorthand for such socially relevant action-effect contingencies. Our key hypothesis is that learning and mastery of these action-effect contingencies are also critical for predicting the consequences of the actions of others and, thus, for enabling effective coupling of agents in social contexts. We investigate socSMCs in human-human and human-robot social interaction scenarios. The main objectives of the project address three societal challenges: i) improving the social competencies of robots, ii) understanding the mechanisms of human social interaction, and iii) applying these insights to improve the condition of people with disturbances of social cognition. All objectives of the socSMCs approach will be benchmarked in several demonstrator scenarios. Our long-term vision is to realize a new socially competent robot technology grounded in novel insights into mechanisms of functional and dysfunctional social behavior, and to test novel aspects and strategies for human-robot interaction and cooperation that can be applied in a multitude of assistive roles, relying on highly compact computational solutions.

Work performed

WP1 – Concepts and models
- information-theoretic analyses of the relation between the robot’s visual input and its overall fitness, yielding image regions with high predictive power for aversive events such as collisions
- development of the empowerment formalism to deal with longer temporal horizons and the coupled interaction of several agents; evaluation in different simulated scenarios (the underlying definition is sketched after this list)
- exploring information-theoretic measures for analyzing the coupling in multi-agent systems; applying these measures to neurophysiological human data; identifying the level of entrainment in multi-agent systems
- work on the reactive control loops from which social interactions and entrainment processes during coordination of joint action may be built: turn-taking, a feedback control layer, and reactive control of the REEM-C robot for postural and walking behaviors
- relating the socSMC concept to Distributed Adaptive Control theory
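
For reference, the single-agent, n-step empowerment that this work extends is conventionally defined as the channel capacity from an agent's own action sequence to its resulting sensor state; the notation below is generic and not the project's:

    \mathcal{E}_n(s_t) = \max_{p(a_t, \ldots, a_{t+n-1})} I\big(A_t, \ldots, A_{t+n-1};\, S_{t+n} \mid S_t = s_t\big)

i.e., the maximum, over distributions of action sequences, of the mutual information between those actions and the sensor state n steps later. The WP1 extension concerns longer horizons n and the case where several coupled agents act on a shared environment.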

WP2 – Grounding social cognition in proximal socSMCs
- configuration of the PR2 robot to control both 7-DOF arms in position, velocity and torque modes; software development to track basic objects using stereo cameras and Kinect
- EEG and eye tracker integration
- pilot study of the labyrinth game paradigm recording EEG, EMG, ECG and GSR; analysis of collaborative learning via the mutual information between the players' EMG signals (a minimal sketch of such an estimate follows this list); evaluation of the degree to which learned action models generalize to previously unknown, but equally trained, partners
- development of an HRI scenario of stabilizing the position of a ball on a jointly held plank
- deep-learning networks for feature learning from raw images and from the audio generated by the sonification system from LUH
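
The mutual-information analysis of collaborative learning mentioned above can be illustrated with a minimal sketch. Assuming two rectified and smoothed EMG envelopes, one per player, sampled on a common time base, a plain histogram-based plug-in estimator of their mutual information might look as follows; the bin count, the synthetic signals and the absence of bias correction are illustrative assumptions, not the project's analysis pipeline.

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Plug-in estimate of I(X;Y) in bits from two 1-D signals via a joint histogram."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()             # joint probability table
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y
        mask = pxy > 0
        return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

    # Hypothetical usage: EMG envelopes of two players during the labyrinth game.
    emg_a = np.abs(np.random.randn(5000))                       # stand-in for player A's envelope
    emg_b = 0.6 * emg_a + 0.4 * np.abs(np.random.randn(5000))   # partially coupled player B
    print(mutual_information(emg_a, emg_b))

Higher values indicate stronger statistical coupling between the players' muscle activity; comparing such estimates across training stages is one way the degree of collaborative learning can be quantified.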

WP3 – Grounding social cognition in distal socSMCs
- integration of 2 force/torque sensors in REEM-C’s arms
- preparing a set of tools for tracking agents in real time in the eXperience Induction Machine
- development of a software stack for basic robot operation: self-balancing, self-collision detection and stabilization; studies of emergence of turn-taking behaviors
- development of an interest point motion analysis method for learning sensorimotor mappings from an action exploration/exploitation process and from observation of the camera input (a sketch follows this list)
- development of an HRI scenario in which the robot learns to control hand movements from imprecise human feedback
- studies of the attentional bottleneck and of how to avoid it by distributing information processing across several sensory modalities
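
The interest point motion analysis mentioned above can be sketched with standard computer-vision building blocks. Assuming an OpenCV-based implementation (an assumption; the project's actual method is not reproduced here), interest points could be detected as Shi-Tomasi corners and their frame-to-frame motion tracked with pyramidal Lucas-Kanade optical flow, yielding motion vectors from which a sensorimotor mapping could be learned:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                     # hypothetical camera index
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    # Detect interest points (Shi-Tomasi corners) in the first frame.
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=7)

    for _ in range(300):                          # bounded loop for the sketch
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Track the points with pyramidal Lucas-Kanade optical flow.
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
        good_new = new_points[status.flatten() == 1]
        good_old = points[status.flatten() == 1]
        motion = good_new - good_old              # per-point motion vectors
        # ...these motion vectors, paired with the executed action, would feed
        # the learning of the sensorimotor mapping...
        prev_gray, points = gray, good_new.reshape(-1, 1, 2)
    cap.release()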

WP4 – socSMCs in disturbed social cognition
- adaptation of the labyrinth game for MEG recordings in ASD patients
- development of a virtual, distal version of the labyrinth game that prompts two players to develop strategies for coordination (a minimal sketch of the paradigm's dynamics follows this list)
- implementation of the pong scenario
- analysis of motion-flow data using mutual information
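
To make the virtual, distal labyrinth paradigm concrete, the sketch below captures its core dynamics: each player controls one tilt axis of a virtual board, and only coordinated input keeps the ball moving toward a target. The simple physics and all parameter values are illustrative assumptions, not the experimental implementation.

    import numpy as np

    dt, gravity, damping = 0.01, 9.81, 0.995      # illustrative constants
    pos = np.zeros(2)                             # ball position (x, y)
    vel = np.zeros(2)                             # ball velocity
    target = np.array([0.2, -0.1])                # hypothetical target location

    def step(tilt_x, tilt_y):
        """Advance the ball one time step; player A sets tilt_x, player B sets tilt_y (radians)."""
        global pos, vel
        accel = gravity * np.array([np.sin(tilt_x), np.sin(tilt_y)])
        vel = damping * (vel + accel * dt)
        pos = pos + vel * dt
        return np.linalg.norm(pos - target)       # joint error both players try to reduce

    # Example: player A regulates the x axis alone; with tilt_y fixed at zero
    # the error along y never decreases, so coordination with player B is required.
    for _ in range(200):
        error = step(tilt_x=-0.2 * (pos[0] - target[0]), tilt_y=0.0)
    print(round(error, 3))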

WP5 – Applications and demonstrators
- conceptual discussion on appropriate measures to reflect participant experience
- first version of the demonstrator in which a single person moves the whole robot through proximal interaction by guiding the hand of the robot with a handshake (a control sketch follows this list)
- simulation of the robot remover scenario
- development of the computational basis for the selfie demonstrator
- application development for quick acquisition of eye-tracking-like data from participants online
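
The hand-guiding demonstrator described above rests on a standard idea: the force measured at the robot's hand is mapped to a commanded motion (an admittance-style law). The sketch below shows that mapping in its simplest form; the gains, dead-band and speed limit are illustrative assumptions, and the demonstrator's actual controller is not reproduced here.

    import numpy as np

    GAIN = np.array([0.02, 0.02, 0.01])   # m/s per N of sensed force, one gain per axis
    DEAD_BAND = 2.0                       # N; ignore sensor noise and light contact
    MAX_SPEED = 0.25                      # m/s; keep the guided motion slow and safe

    def hand_guidance_velocity(force_xyz):
        """Map the force sensed at the guided hand to a velocity command for the robot."""
        f = np.asarray(force_xyz, dtype=float)
        f = np.where(np.abs(f) < DEAD_BAND, 0.0, f)   # suppress small forces
        v = GAIN * f                                   # admittance law: v = K * f
        return np.clip(v, -MAX_SPEED, MAX_SPEED)

    # Hypothetical force/torque reading while a person pulls the robot's hand forward.
    print(hand_guidance_velocity([12.0, -3.5, 0.8]))   # motion mainly along x, none along z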

WP6 – Dissemination and exploitation
- setup of the project’s web presence
- publication output monitoring
- communication with several other consortia
- autumn school “The sensorimotor foundations of social cognition”
- publication of project output in high-ranked journals; risk monitoring
- planning exploitation of the results for PAL’s lines of service robots

Final results

The project will contribute to a qualitative change in HRI and human-robot cooperation, unlocking new capabilities and application areas together with enhanced robustness and monitoring. It aims at investigating socSMCs in non-verbal social interaction scenarios. We explore to what degree such implicit mechanisms can be enhanced by cross-modal augmentation approaches using sonification and haptification of socially affording sensorimotor signals.
We test a novel model of social behavior, leading to novel ways of quantifying social coupling. In particular, we pioneer a new approach for testing the salience and effectiveness of social affordances in humans. This approach will define innovative strategies for realizing and benchmarking HRI scenarios.
Our approach will enable the introduction of new concepts to robotics, overcoming divides in HRI, facilitating the training of social abilities in people with ASD, and generalizing further towards ambient assisted living, caregiving, interactive social games and online scenarios. The sonification and haptification of socially affording stimuli and coupling patterns that we explore here may enhance social entrainment in both HHI and HRI scenarios. For example, work on distal socSMCs in joint attentional search could enhance the use of online resources by facilitating search through shared attentional cues and augmented feedback. The socSMCs approach could also have an impact on the development of assistive technology for social entrainment, which might be implemented, e.g., as smartphone-based apps for interpersonal synchronization using sonification and online estimation of sensorimotor entrainment.
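
As an illustration of the sonification idea mentioned above, the sketch below maps a slowly varying entrainment estimate onto the pitch of a tone and writes the result to an audio file; the synthetic input signal and the pitch range are assumptions chosen only for illustration.

    import numpy as np
    from scipy.io import wavfile

    fs = 44100                                      # audio sample rate in Hz
    t = np.linspace(0, 5, 5 * fs, endpoint=False)
    # Stand-in for an online entrainment estimate in [0, 1] (here a slow sinusoidal sweep).
    entrainment = 0.5 + 0.5 * np.sin(2 * np.pi * 0.2 * t)
    # Map coupling strength to pitch: 220 Hz (weak coupling) up to 880 Hz (strong coupling).
    freq = 220.0 + 660.0 * entrainment
    phase = 2 * np.pi * np.cumsum(freq) / fs        # integrate instantaneous frequency
    tone = 0.3 * np.sin(phase)
    wavfile.write("entrainment_sonification.wav", fs, tone.astype(np.float32))

In an interactive setting, the entrainment estimate would be computed online from the two partners' movement or physiological signals and the tone updated continuously, giving both partners immediate auditory feedback about their degree of coupling.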

Website & more info

More info: http://s.eu.