Langroudi, George, Jordanous, Anna, Li, Ling (2018) Music Emotion Capture: sonifying emotions in EEG data. In: Emotion Modelling and Detection in Social Media and Online Interaction symposium at the AISB 2018 Convention, 4-6 April 2018, Liverpool, UK. (KAR id:66564)
PDF: Author's Accepted Manuscript. Paper will also be published open access by the conference by April 2018.
Language: English
Official URL: http://collab.di.uniba.it/aisbemotions/
Abstract
People’s emotions are not always obviously detectable, due to difficulties expressing emotions or to geographic distance (e.g. if people are communicating online). There are also many occasions where it would be useful for a computer to detect users’ emotions and respond to them appropriately. A person’s brain activity gives vital clues to the emotions they are experiencing at any given time. The aim of this project is to detect, model and sonify people’s emotions. To achieve this, there are two tasks: (1) to detect emotions based on current brain activity as measured by an EEG device; (2) to play appropriate music in real time, representing the current emotional state of the user. Here we report a pilot study implementing the Music Emotion Capture system. In future work we plan to improve how this project performs emotion detection through EEG, and to generate new music based on emotion-based characteristics of music. Potential applications arise in collaborative/assistive software and brain-computer interfaces for non-verbal communication.
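The pipeline the abstract describes — an emotion estimate derived from EEG, mapped in real time to musical parameters — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the valence–arousal input, the parameter ranges, and the `emotion_to_music` mapping are all assumptions chosen for clarity.

```python
# Hypothetical sketch of the sonification step: map a valence-arousal
# emotion estimate (as might be produced by an EEG classifier) to simple
# musical parameters. The specific mapping is illustrative, not the
# system described in the paper.

def emotion_to_music(valence: float, arousal: float) -> dict:
    """Map valence/arousal values in [-1, 1] to tempo (BPM) and mode.

    Higher arousal -> faster tempo; positive valence -> major mode.
    """
    if not (-1.0 <= valence <= 1.0 and -1.0 <= arousal <= 1.0):
        raise ValueError("valence and arousal must lie in [-1, 1]")
    # Scale arousal from [-1, 1] to a 60-180 BPM tempo range.
    tempo = 60 + int((arousal + 1.0) * 60)
    # Positive valence maps to major, negative to minor.
    mode = "major" if valence >= 0 else "minor"
    return {"tempo_bpm": tempo, "mode": mode}

print(emotion_to_music(0.5, 0.8))   # a happy, excited state
print(emotion_to_music(-0.2, -1.0)) # a sad, calm state
```

In a real-time system, a function like this would be called on each new EEG-derived emotion estimate, with the resulting parameters driving music playback or generation.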