
ACTICIPATE

Action understanding in human and robot dyadic interaction


Project "ACTICIPATE" data sheet

The following table provides information about the project.

Coordinator
INSTITUTO SUPERIOR TECNICO 

Organization address
address: AVENIDA ROVISCO PAIS 1
city: LISBOA
postcode: 1049-001
website: www.ist.utl.pt

contact info: n.a. (no contact person, email, telephone, or fax provided)

Coordinator country: Portugal [PT]
Project website: http://acticipate.eu/
Total cost: 100,397 €
EC max contribution: 100,397 € (100%)
Programme: H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
Call: H2020-MSCA-IF-2016
Funding scheme: MSCA-IF-EF-ST
Starting year: 2017
Duration: from 2017-06-01 to 2018-08-31 (year-month-day)

 Partnership

Take a look at the project's partnership.

#   participant                  country        role         EC contrib. [€]
1   INSTITUTO SUPERIOR TECNICO   PT (LISBOA)    coordinator  100,397.00


Project objective

Humans have fascinating skills for grasping and manipulating objects, even in complex, dynamic environments, and execute coordinated movements of the head, eyes, arms, and hands to accomplish everyday tasks. When working in a shared space, during dyadic interaction tasks, humans engage in non-verbal communication by understanding and anticipating the actions of their working partners and coupling their actions in a meaningful way. The key to this mind-boggling performance is two-fold: (i) a capacity to adapt and plan motion according to unexpected events in the environment, and (ii) the use of a common motor repertoire and action model to understand and anticipate the actions and intentions of others as if they were our own.

Despite decades of progress, robots are still far from the level of performance that would enable them to work with humans in routine activities. ACTICIPATE addresses the challenge of designing robots that can share workspaces and co-work with humans. We rely on human experiments to learn a model/controller that allows a humanoid to generate and adapt its upper-body motion in dynamic environments during reaching and manipulation tasks, and to understand, predict, and anticipate the actions of a human co-worker, as needed in manufacturing, assistive and service robotics, and domestic applications.

These application scenarios call for three main capabilities that will be tackled in ACTICIPATE:
(i) a motion generation mechanism (primitives) with a built-in capacity for instant reaction to changes in dynamic environments;
(ii) a framework to combine primitives and execute coordinated movements of the head, eyes, arm, and hand in a way that is similar to human movements (and thus predictable), and to model the action/movement coupling between co-workers in dyadic interaction tasks;
(iii) the ability to understand and anticipate human actions, based on a common motor system/model that is also used to synthesize the robot's goal-directed actions in a natural way.
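As an illustration of capability (i), motion primitives with instant reaction to a changing environment are commonly realized as goal attractors in the style of dynamic movement primitives (DMPs), whose target can be updated at any point during execution. The sketch below is a minimal, hypothetical Python example of that idea, not the project's actual controller: the gains, time step, and mid-movement goal switch are arbitrary assumptions, and a full DMP would add a learned forcing term on top of the attractor.

# Minimal sketch of online goal adaptation with a critically damped
# goal attractor (the core of a discrete DMP, forcing term omitted).
# Illustrative toy only; gains and the goal switch are arbitrary.
import numpy as np

def run_attractor(y0, goal, duration=1.0, dt=0.001,
                  alpha=25.0, beta=6.25, goal_update=None):
    """Integrate 1-D spring-damper dynamics toward `goal`.
    goal_update: optional (time, new_goal) pair simulating a target
    that moves while the movement is being executed."""
    y, dy, t = y0, 0.0, 0.0
    trajectory = []
    while t < duration:
        if goal_update and t >= goal_update[0]:
            goal = goal_update[1]               # target moved: adapt instantly
        ddy = alpha * (beta * (goal - y) - dy)  # spring-damper acceleration
        dy += ddy * dt
        y += dy * dt
        trajectory.append(y)
        t += dt
    return np.array(trajectory)

# Reach toward 0.5 m; halfway through, the target jumps to 0.8 m.
traj = run_attractor(y0=0.0, goal=0.5, goal_update=(0.5, 0.8))
print(f"final position: {traj[-1]:.3f} m")

Because the attractor always pulls toward the current goal, updating the goal mid-execution produces a smooth, immediate re-plan rather than a restart, which is the kind of instant reaction the objective describes.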

Publications

List of publications (last update: 2019-05-28).

2018. Mirko Raković, Nuno Duarte, Jovica Tasevski, José Santos-Victor, Branislav Borovac. "A dataset of head and eye gaze during dyadic interaction task for modeling robot gaze behavior." MATEC Web of Conferences 161, p. 3002. ISSN: 2261-236X. DOI: 10.1051/matecconf/201816103002.

2018. Nuno Duarte, Mirko Rakovic, Jorge Marques, Jose Santos-Victor. "Action Anticipation by Predicting Future Dynamic Images." European Conference on Computer Vision (ECCV) 2018, Anticipating Human Behavior workshop.

2018. Nuno Ferreira Duarte, Mirko Rakovic, Jovica Tasevski, Moreno Ignazio Coco, Aude Billard, Jose Santos-Victor. "Action Anticipation: Reading the Intentions of Humans and Robots." IEEE Robotics and Automation Letters 3(4), pp. 4132-4139. ISSN: 2377-3766. DOI: 10.1109/LRA.2018.2861569.

2018. Mirko Raković, Govind Anil, Živorad Mihajlović, Srđjan Savić, Siddhata Naik, Branislav Borovac, Achim Gottscheber. "Fuzzy position-velocity control of underactuated finger of FTN robot hand." Journal of Intelligent & Fuzzy Systems 34(4), pp. 2723-2736. ISSN: 1064-1246. DOI: 10.3233/JIFS-17879.

2019. Mirko Raković, Nuno Duarte, Jorge Marques, Aude Billard, Jose Santos-Victor. "Modeling the Gaze Dialogue: Non-verbal communication in Human-Human and Human-Robot Interaction." IEEE Transactions on Cybernetics, to be submitted. ISSN: 2168-2267.

Are you the coordinator (or a participant) of this project? Please send me more information about the "ACTICIPATE" project.

For instance: the website URL (if it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will add them to your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "ACTICIPATE" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.3.2.)

SingleCellAI (2019)

Deep-learning models of CRISPR-engineered cells define a rulebook of cellular transdifferentiation


Comedy and Politics (2018)

The Comedy of Political Philosophy. Democratic Citizenship, Political Judgment, and Ideals in Political Practice.


HSQG (2020)

Higher Spin Quantum Gravity: Lagrangian Formulations for Higher Spin Gravity and Their Applications
