Explore the word cloud of the MUSICAL-MOODS project. It gives a very rough idea of what the "MUSICAL-MOODS" project is about.
The following table provides information about the project.
Coordinator | UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
---|---
Organization address / contact info |
Coordinator country | Italy [IT]
Project website | http://www.musicalmoods2020.org/
Total cost | 244˙269 €
EC max contribution | 244˙269 € (100%)
Programme | 1. H2020-EU.1.3.2. (Nurturing excellence by means of cross-border and cross-sector mobility)
Code Call | H2020-MSCA-IF-2014
Funding Scheme | MSCA-IF-GF
Starting year | 2015
Duration (year-month-day) | from 2015-12-01 to 2019-04-02
Take a look at the project's partnership.
# | Participant | Country (city) | Role | EC contribution (€)
---|---|---|---|---
1 | UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA | IT (ROMA) | coordinator | 244˙269.00 |
2 | THE REGENTS OF THE UNIVERSITY OF CALIFORNIA | US (OAKLAND CA) | partner | 0.00 |
The project aims to develop an online database of scores, lyrics and musical excerpts, vector-based 3D animations, and dance video recordings, all indexed by mood. This taxonomy of relations between the musical, linguistic and motion domains is intended for interactive music systems and music making. To build the database, digital scores including lyrics will be gathered from public-domain music collections. Music mood classification based on audio and metadata will aim to capture sophisticated features without relying on explicit domain-specific knowledge about mental states, and datasets will be built through a cross-modal approach. The model will be validated by combining results from an online game-with-a-purpose for Internet users with intermedia case studies involving selected dancers. In further case studies, music works will be realised, also by invited artists, to evaluate the database in interactive music making, and an online call for artists to use the database for music making or sound generation will extend the evaluation further. The final database will be made available online for further exploitation. This research will generate new knowledge for next-generation systems for interactive music and music emotion recognition, and will help extend the investigation into the broader areas of music making, computational creativity and information retrieval.
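The multilabel nature of mood tagging described above (an excerpt can be, say, both calm and melancholic at once) can be pictured with a minimal sketch. The example below is purely illustrative and not taken from the project: it assumes scikit-learn, synthetic feature vectors standing in for audio/metadata descriptors, an arbitrary set of mood tags, and a one-vs-rest logistic regression as a placeholder model.

```python
# Illustrative sketch only: synthetic features and mood tags, not the project's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

rng = np.random.default_rng(0)

# Toy dataset: 200 "excerpts", each described by 16 placeholder feature values
# (in a real system these would come from audio analysis and metadata).
X = rng.normal(size=(200, 16))

# Each excerpt carries one or two mood tags at once (multilabel annotation).
moods = ["calm", "tense", "joyful", "melancholic"]
y_raw = [list(rng.choice(moods, size=rng.integers(1, 3), replace=False))
         for _ in range(200)]

# Turn the tag lists into a binary indicator matrix, one column per mood.
mlb = MultiLabelBinarizer(classes=moods)
Y = mlb.fit_transform(y_raw)

# One independent binary classifier per mood tag (one-vs-rest).
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X[:150], Y[:150])

# Predict mood tags for held-out excerpts and map them back to tag names.
pred = clf.predict(X[150:])
print(mlb.inverse_transform(pred)[:5])
```

Since the labels here are random, the predictions are meaningless; the point is only the shape of the problem: many excerpts, several non-exclusive mood tags each, and one classifier trained per tag.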
year | authors and title | journal | last update
---|---|---|---
2019 | Fabio Paolizzo, "M-GWAP: An Online and Multimodal Game With A Purpose in WordPress for Mental States Annotation" | arXiv.org | 2019-09-02
2017 | Fabio Paolizzo, Colin Johnson, "Autonomy in the Interactive Music System VIVO: A New Framework" | arXiv.org | 2019-09-02
2019 | Fabio Paolizzo, Natalia Pichierri, Daniele Casali, Daniele Giardino, Marco Matta, Giovanni Costantini, "Multilabel Automated Recognition of Induced Emotions through Music" | arXiv.org | 2019-09-02
2018 | Giovanni Costantini, Daniele Casali, Fabio Paolizzo, Marco Alessandrini, Alessandro Micarelli, Andrea Viziano, Giovanni Saggio, "Towards the enhancement of body standing balance recovery by means of a wireless audio-biofeedback system" (pages 74-81, ISSN: 1350-4533, DOI: 10.1016/j.medengphy.2018.01.008) | Medical Engineering & Physics 54 | 2019-09-02
2019 | Fabio Paolizzo, Colin Johnson, "Creative Autonomy Through Salience and Multidominance in Interactive Music Systems: Evaluating an Implementation" (ISSN: 0929-8215) | Journal of New Music Research NNMR-2018-0017.R1 (under review) | 2019-09-02
2017 | M. Alessandrini, A. Micarelli, A. Viziano, I. Pavone, G. Costantini, D. Casali, F. Paolizzo, G. Saggio, "Body-worn triaxial accelerometer coherence and reliability related to static posturography in unilateral vestibular failure" (pages 231–236, ISSN: 1827-675X, DOI: 10.14639/0392-100x-1334) | Acta Otorhinolaryngol Ital. 2017 Jun; Vol. 37 (3) | 2019-09-02
2018 | Fabio Paolizzo, "Enabling Embodied Analogies in Intelligent Music Systems" | A Body of Knowledge Conference 2016 Conference Proceedings (20 | 2019-09-02
Are you the coordinator (or a participant) of this project? Please send me more information about the "MUSICAL-MOODS" project.
For instance: the website URL (if it has not been provided by the EU open data yet), the logo, a more detailed description of the project (in plain text, as an RTF or Word file), some pictures (as image files, not embedded in a Word file), the Twitter account, the LinkedIn page, etc.
Send me an email (fabio@fabiodisconzi.com) and I will put them on your project's page as soon as possible.
Thanks. And then please add a link to this page on your project's website.
The information about "MUSICAL-MOODS" is provided by the European Open Data Portal: CORDIS open data.