HumRobManip

Robotic Manipulation Planning for Human-Robot Collaboration on Forceful Manufacturing Tasks

Project "HumRobManip" data sheet

The following table provides information about the project.

Coordinator
UNIVERSITY OF LEEDS 

Organization address
address: WOODHOUSE LANE
city: LEEDS
postcode: LS2 9JT
website: www.leeds.ac.uk

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

 Coordinator Country United Kingdom [UK]
 Total cost 195,454 €
 EC max contribution 195,454 € (100%)
 Programme 1. H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
 Code Call H2020-MSCA-IF-2016
 Funding Scheme MSCA-IF-EF-RI
 Starting year 2017
 Duration from 2017-05-01 to 2019-04-30 (24 months)

 Partnership

Take a look at the project's partnership.

#  participant  country  role  EC contrib. [€]
1  UNIVERSITY OF LEEDS (LEEDS)  UK  coordinator  195,454.00

 Project objective

This proposal addresses robotic manipulation planning for human-robot collaboration during manufacturing. My objective is to develop a planning framework which will enable a team of robots to grasp, move and position manufacturing parts (e.g. planks of wood) such that a human can execute sequential forceful manufacturing operations (e.g. drilling, cutting) to build a product (e.g. a wooden table).

The overall objective is divided into three components. First, I will develop a planning algorithm which, given the description of a manufacturing task, plans the actions of all robots in a human-robot team to perform the task. Second, I will develop probabilistic models of human interaction to be used by the planner. This model will include (i) an action model that assigns probabilities to different manufacturing operations (e.g. drilling a hole vs. cutting a piece off) as the next actions the human intends to do; (ii) a geometric model that assigns probabilities to human body postures; and (iii) a force model that assigns probabilities to force vectors as the predicted operational forces. Third, I will build a real robotic system to perform experiments and test my algorithm's capabilities. This system will consist of at least three robot manipulators.

This fellowship will enable me to add a completely new human dimension to my planning research. I will work with Prof. Tony Cohn (supervisor), who is a world-leading expert in human activity recognition and prediction - a critical skill for the human-robot collaboration problem I intend to solve. From him and his group, I will receive training on tracking/predicting human posture and recognizing/predicting human activities using vision and point-cloud data. I will then integrate these tracking and prediction methods into a robotic planning framework to enable human-robot collaborative operations. This fellowship will help me to attain a permanent academic position and to become a leading researcher in robotic manipulation.
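To make the first sub-model above concrete, here is a minimal sketch of what an "action model" that assigns probabilities to candidate next manufacturing operations could look like. This is purely illustrative: the operation names, scores, and the `normalize` helper are assumptions for this example, not code or data from the project.

```python
# Hypothetical sketch of an action model: it turns evidence scores for
# candidate next operations into a probability distribution, from which
# the planner could read off the human's most likely next action.
# All names and numbers below are illustrative assumptions.

def normalize(scores):
    """Turn non-negative evidence scores into a probability distribution."""
    total = sum(scores.values())
    return {op: s / total for op, s in scores.items()}

# Illustrative evidence scores for the human's next operation,
# e.g. derived from observed posture and the tool currently in hand.
scores = {"drill_hole": 3.0, "cut_piece": 1.0, "sand_surface": 1.0}

action_probs = normalize(scores)
most_likely = max(action_probs, key=action_probs.get)

print(most_likely)                  # prints "drill_hole"
print(action_probs["drill_hole"])   # prints 0.6
```

The geometric and force models described in the objective would play the same role over continuous quantities (body postures and force vectors), typically as densities rather than a discrete distribution like this one.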

Are you the coordinator (or a participant) of this project? Please send me more information about the "HUMROBMANIP" project.

For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), a Twitter account, a LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And then please put a link to this page on your project's website.

The information about "HUMROBMANIP" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.3.2.)

MultiSeaSpace (2019)

Developing a unified spatial modelling strategy that accounts for interactions between species at different marine trophic levels, and different types of survey data.

MOSAiC (2019)

Multimode cOrrelations in microwave photonics with Superconducting quAntum Circuits

E-CLIPS (2019)

Effects of Cross-Linguistic Interactions on Perception of Speech
