MARIE

Multimodal Activity Recognition for Interactive Environments

 Coordinator: LANCASTER UNIVERSITY

 Organization address: BAILRIGG
city: LANCASTER
postcode: LA1 4YW

Contact info
Title: Ms.
First name: Chris
Last name: Needham
Phone: +44 1524 510314
Fax: +44 1524 510492

 Coordinator nationality: United Kingdom [UK]
 Total cost: 168,256 €
 EC contribution: 168,256 €
 Programme: FP7-PEOPLE
Specific programme "People" implementing the Seventh Framework Programme of the European Community for research, technological development and demonstration activities (2007 to 2013)
 Call code: FP7-PEOPLE-2007-2-1-IEF
 Funding scheme: MC-IEF
 Start year: 2008
 Period (year-month-day): 2008-07-01 to 2010-06-30

 Participants

#   participant             country          role          EC contrib. [€]
1   LANCASTER UNIVERSITY    UK (LANCASTER)   coordinator   0.00

 Word cloud

Explore the word cloud to get a rough overview of the project.

modality    human    eyes    indicator    patterns    computing    sensors    context    certain    day    ambient    interaction    everyday    gaze    plan    eye    recognition    environment    wearable   

 Project objective

'With the increasingly ubiquitous use of computing devices in our environment, there is a clear need for new methods of human computer interaction. In order to support day-to-day human activity seamlessly, it is essential that computing systems have a sense of the user's situation or context. As a consequence, activity recognition has become a central research challenge toward ambient intelligence. The proposed work aims to make three distinct contributions to the study of activity recognition. First, this project will introduce the use of eye movement patterns as a novel sensing modality for wearable activity recognition. Almost everything most of us do is guided by our visual system; consequently, by studying what our eyes are doing, clues can be gathered as to what we are doing, or intend to do. Beyond established gaze tracking, this work will look at the general patterns our eyes make during certain tasks and in certain situations, as context for interaction. Second, we aim to study robust activity recognition in a realistic, everyday setting. In a novel approach, we plan to exploit the synergy of information from wearable sensors (on user actions) and ambient sensors (on the environment being manipulated), and demonstrate this through recognition of a number of everyday activities. Third, we will investigate how activity recognition can be used to infer user attention. Eye gaze is a good indicator of attention (and has been used as such in the past), but it is not always a convenient modality to use. In this work we plan to assess the use of activity recognition, in particular of locomotion, as an indicator of user attention. A final aspect of the proposal lies in the methodology applied to implementing and evaluating the various recognition problems encountered in this work, advancing the fellow's fundamental work on performance evaluation for activity recognition.'
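
As a purely illustrative sketch of the kind of processing implied by the first two contributions (fusing eye-movement, wearable, and ambient sensing), the following Python example combines simple eye-movement features with wearable and ambient sensor features in an off-the-shelf classifier. The feature choices, thresholds, and the use of NumPy and scikit-learn are assumptions made for illustration only and are not taken from the project description.

# Illustrative sketch only: fusing eye-movement, wearable and ambient
# sensor features for activity recognition. All feature choices and
# thresholds are assumptions, not the project's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def eye_features(gaze_xy, fs=50.0, saccade_thresh=30.0):
    # Crude eye-movement descriptors from an (N, 2) gaze trace in degrees:
    # fraction of samples with saccade-like velocity, plus gaze dispersion.
    vel = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1) * fs  # deg/s
    return np.array([np.mean(vel > saccade_thresh), np.std(gaze_xy, axis=0).mean()])

def wearable_features(accel_xyz):
    # Mean and variance of acceleration magnitude over an (N, 3) window.
    mag = np.linalg.norm(accel_xyz, axis=1)
    return np.array([mag.mean(), mag.var()])

def fused_window(gaze_xy, accel_xyz, ambient_events):
    # Concatenate eye, wearable and ambient features (e.g. counts of
    # cupboard/appliance switch events) into one feature vector per window.
    return np.concatenate([eye_features(gaze_xy),
                           wearable_features(accel_xyz),
                           [np.sum(ambient_events)]])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Random toy data, only to show that the pipeline runs end to end.
    X = np.stack([fused_window(rng.normal(size=(250, 2)),
                               rng.normal(size=(250, 3)),
                               rng.integers(0, 2, size=5)) for _ in range(40)])
    y = rng.integers(0, 2, size=40)  # arbitrary toy activity labels
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    print(clf.predict(X[:5]))

In the actual project one would expect windowed electrooculography or gaze-tracker recordings and real ambient sensor events in place of the random stand-ins above; the point of the sketch is simply that features from the different modalities can be concatenated per time window and fed to a single classifier.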

Other projects under the same programme (FP7-PEOPLE)

TORTELLEX (2015)

Giovanni Tortelli's Orthographia and Greek studies in XVth century Europe

TOMOSLATE (2015)

"New uses for X-ray Tomography in natural building stones: characterization, pathologies and restoration of historical and recent roofing slates"

MOTIF (2009)

Modern Methods of Operator Algebras for Time-Frequency Analysis
