Brain-Computer Music Interfacing (BCMI): From Basic Research to the Real World of Special Needs

Miranda, E.R., Magee, W.L., Wilson, J.J., Eaton, J., Palaniappan, R. (2011) Brain-Computer Music Interfacing (BCMI): From Basic Research to the Real World of Special Needs. Music & Medicine, 3 (3). pp. 134-140. ISSN 1943-8621. (doi:10.1177/1943862111399290)

The full text of this publication is not currently available from this repository. You may be able to access a copy if URLs are provided.
Official URL: http://dx.doi.org/10.1177/1943862111399290

Abstract

This paper reports on the development of a proof-of-concept brain-computer music interfacing (BCMI) system, which we built to be tested with a patient with Locked-in Syndrome at the Royal Hospital for Neuro-disability in London. The system uses the Steady State Visual Evoked Potential (SSVEP) method, whereby targets representing actions available to perform with the system are presented to a user on a computer monitor. Each target is encoded by a flashing visual pattern reversing at a unique frequency. To make a selection, the user must direct her gaze at the target corresponding to the action she would like to perform. The patient grasped the concept quickly and demonstrated her skill at controlling the system with minimal practice. She was able to vary the intensity of her gaze, thus changing the amplitude of her EEG and varying the consequent musical parameters. We have proved the concept that such a BCMI system is cost-effective to build, viable, and useful. However, ergonomic and design aspects of the system require further refinement to make it more practical for clinical usage. For instance, the system at present requires a therapist to place individual electrodes and calibrate a user's response to each stimulus, which can be time consuming. A new version of the system will require only the positioning of a headset and, owing to improved algorithms, will require no calibration. © 2011, The Author(s). All rights reserved.
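The SSVEP selection principle the abstract describes — each on-screen target flickers at a unique frequency, and the gazed-at target's frequency dominates the EEG spectrum — can be sketched with a simple spectral-power classifier. This is an illustrative sketch only: the flicker frequencies, sampling rate, and synthetic signal below are hypothetical choices, not the parameters of the system reported in the paper.

```python
# Illustrative SSVEP target selection via spectral power.
# Assumptions (not from the paper): four targets at 7/9/11/13 Hz,
# 256 Hz sampling, single-channel EEG, a plain FFT power readout.
import numpy as np

TARGET_FREQS = [7.0, 9.0, 11.0, 13.0]  # hypothetical pattern-reversal rates (Hz)
FS = 256                               # hypothetical sampling rate (Hz)

def classify_ssvep(eeg, fs=FS, targets=TARGET_FREQS):
    """Return the index of the target whose flicker frequency
    carries the most power in the EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power in the FFT bin nearest each candidate frequency.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in targets]
    return int(np.argmax(powers))

# Synthetic check: a 9 Hz oscillation buried in noise should pick target 1.
rng = np.random.default_rng(0)
t = np.arange(0, 4.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg))  # -> 1
```

The abstract also notes that gaze intensity modulated EEG amplitude, which in turn drove musical parameters; in a sketch like this, the power value at the winning frequency (rather than only the argmax) would be the natural quantity to map onto such a continuous control.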

Item Type: Article
DOI/Identification number: 10.1177/1943862111399290
Additional information: Affiliations: Interdisciplinary Centre for Computer Music Research (ICCMR), Faculty of Arts, University of Plymouth, Plymouth, United Kingdom; Institute of Neuropalliative Rehabilitation, Royal Hospital for Neuro-disability, West Hill, London, United Kingdom; Brain-Computer Interfaces Group, School of Computer Science and Electronic Engineering, University of Essex, Colchester, United Kingdom. Bibliographic source: Scopus.
Uncontrolled keywords: arts medicine, music medicine, music therapy, palliative care
Divisions: Faculties > Sciences > School of Computing > Data Science
Depositing User: Palaniappan Ramaswamy
Date Deposited: 12 Dec 2018 22:15 UTC
Last Modified: 30 May 2019 08:29 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/70707
Palaniappan, Ramaswamy: https://orcid.org/0000-0001-5296-8396