

Anticipatory Human-Computer Interaction


Project "ANTICIPATE" data sheet

The following table provides information about the project.


Organization address
postcode: 70174

Contact info
title, name, surname, function, email, telephone, fax: n.a.

 Coordinator country: Germany [DE]
 Total cost: €1,499,625
 EC max contribution: €1,499,625 (100%)
 Programme: H2020-EU.1.1. (EXCELLENT SCIENCE - European Research Council (ERC))
 Call code: ERC-2018-STG
 Funding scheme: ERC-STG
 Starting year: 2019
 Duration: from 2019-02-01 to 2024-01-31


Take a look at the project's partnership.

#  participant                country         role         EC contrib. [€]
1  UNIVERSITAET STUTTGART    DE (Stuttgart)  coordinator  1,499,625.00


 Project objective

Even after three decades of research on human-computer interaction (HCI), current general-purpose user interfaces (UI) still lack the ability to attribute mental states to their users, i.e. they fail to understand users' intentions and needs and to anticipate their actions. This drastically restricts their interactive capabilities.

ANTICIPATE aims to establish the scientific foundations for a new generation of user interfaces that pro-actively adapt to users' future input actions by monitoring their attention and predicting their interaction intentions - thereby significantly improving the naturalness, efficiency, and user experience of the interactions. Realising this vision of anticipatory human-computer interaction requires groundbreaking advances in everyday sensing of user attention from eye and brain activity. We will further pioneer methods to predict entangled user intentions and forecast interactive behaviour with fine temporal granularity during interactions in everyday stationary and mobile settings. Finally, we will develop fundamental interaction paradigms that enable anticipatory UIs to pro-actively adapt to users' attention and intentions in a mindful way. The new capabilities will be demonstrated in four challenging cases: 1) mobile information retrieval, 2) intelligent notification management, 3) Autism diagnosis and monitoring, and 4) computer-based training.

Anticipatory human-computer interaction offers a strong complement to existing UI paradigms that only react to user input post-hoc. If successful, ANTICIPATE will deliver the first important building blocks for implementing Theory of Mind in general-purpose UIs. As such, the project has the potential to drastically improve the billions of interactions we perform with computers every day, to trigger a wide range of follow-up research in HCI as well as adjacent areas within and outside computer science, and to act as a key technical enabler for new applications, e.g. in healthcare and education.


List of publications.

2020: Mihai Bâce, Sander Staal, Andreas Bulling, "Quantification of Users' Visual Attention During Everyday Mobile Device Interactions", Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). DOI: 10.1145/3313831.3376449. Last update: 2020-01-30.

Are you the coordinator (or a participant) of this project? Please send me more information about the "ANTICIPATE" project.

For instance: the website URL (not yet provided by EU-opendata), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email and I will put them on your project's page as soon as possible.

Thanks. And then please add a link to this page on your project's website.

The information about "ANTICIPATE" is provided by the European Opendata Portal: CORDIS opendata.

More projects from the same programme (H2020-EU.1.1.)


Efficient Conversion of Quantum Information Resources


PROCOMM (2019)

Commercialisation of Proteus


WE (2020)

Who are we? Self-identity, Social Cognition, and Collective Intentionality
