
RHYTHMSYNC

Rhythm synchronization between music and spoken language


Project "RHYTHMSYNC" data sheet

The following table provides information about the project.

Coordinator
UNIVERSITAET POTSDAM 

Organization address
address: AM NEUEN PALAIS 10
city: POTSDAM
postcode: 14469
website: www.uni-potsdam.de

contact info
title: n.a.
name: n.a.
surname: n.a.
function: n.a.
email: n.a.
telephone: n.a.
fax: n.a.

 Coordinator country: Germany [DE]
 Project website: https://www.researchgate.net/profile/Alan_Langus
 Total cost: €159,460
 EC max contribution: €159,460 (100%)
 Programme: H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
 Call: H2020-MSCA-IF-2016
 Funding scheme: MSCA-IF-EF-ST
 Starting year: 2017
 Duration: 2017-04-01 to 2019-03-31

 Partnership

Take a look at the project's partnership.

#  participant              country (city)  role         EC contrib. [€]
1  UNIVERSITAET POTSDAM     DE (Potsdam)    coordinator  159,460.00


 Project objective

Rhythm perception is important for a wide range of higher cognitive abilities, from time perception to predicting the occurrence of future events, from perceiving language to dancing to the beat of music. Despite a considerable amount of research on rhythm across scientific disciplines, the way the human mind perceives and produces rhythm is not fully understood. While many interdisciplinary studies compare performance in one cognitive domain with performance in another, it has proven difficult to link the mechanisms of rhythm perception across perceptual and cognitive domains directly.

This project therefore looks at how rhythm in music and spoken language interacts by relying on rhythm synchronization: the phenomenon that our mind tends to automatically synchronize our motor activity to the rhythm we perceive auditorily (e.g. tapping a finger to the beat of music, the gestures that accompany speech, and singing). Because rhythm in spoken language and music is shaped by experience with culture-specific music and our native language, the project studies not only adults but also young infants from birth through the crucial early stages of vocal and motor development.

The studies in this project rely on a combination of acoustic analyses and electrophysiological methods (sEGM) to determine how rhythm in language and music is synchronized, how synchronization unfolds in time, and how differences in rhythm between the two domains affect rhythm synchronization. By looking at similarities and differences in the rhythm of spoken language and music, the project attempts to create a blueprint of the shared and domain-specific cognitive mechanisms necessary for rhythm processing. Because we are surrounded by rhythm in our everyday lives, the study of rhythm synchronization can help us understand how different rhythms interact and how they influence our daily life and behaviour.

Are you the coordinator (or a participant) of this project? Please send me more information about the "RHYTHMSYNC" project.

For instance: the website URL (it has not been provided by EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF file or a Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.

Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.

Thanks. And please add a link to this page on your project's website.

The information about "RHYTHMSYNC" is provided by the European Open Data Portal: CORDIS open data.

More projects from the same programme (H2020-EU.1.3.2.)

ASIQS (2019)

Antiferromagnetic spintronics investigated by quantum sensing techniques

Read More  

LIGHTMATT-EXPLORER (2019)

Experimental determination of the paraxial-vectorial limit of light-matter interactions

Read More  

MOSAiC (2019)

Multimode cOrrelations in microwave photonics with Superconducting quAntum Circuits

Read More