Brain-computer music interface for composition and performance

Research output: Contribution to journal › Article › peer-review

Abstract

This paper introduces a new brain-computer interface (BCI) system that uses electroencephalogram (EEG) information to steer generative rules in order to compose and perform music. It begins by surveying previous attempts at designing BCI systems, including systems for music, and then gives a short technical introduction to EEG sensing and analysis. Next, it introduces the generative music component of the system, which employs our own adapted version of a machine-learning technique based on ATNs (Augmented Transition Networks) for computer replication of musical styles. The system constantly monitors the subject's EEG and activates generative rules associated with the activity of different frequency bands of the spectrum of the EEG signal. It also measures the complexity of the EEG signal in order to modulate the tempo (beat) and dynamics (loudness) of the performance. Subjects can be trained to select between different classes of generative rules to produce original pieces of music.
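The two control mappings the abstract describes — dominant EEG frequency band selecting a class of generative rules, and signal complexity modulating tempo — can be illustrated with a minimal sketch. This is not the authors' implementation: the band ranges are standard EEG conventions, the rule-class names are placeholders, and spectral entropy is used here as one possible complexity measure.

```python
import numpy as np

# Standard EEG band ranges (Hz); rule classes are hypothetical placeholders.
BANDS = {"alpha": (8.0, 12.0), "beta": (13.0, 30.0)}
RULES = {"alpha": "rule_set_A", "beta": "rule_set_B"}

def band_powers(eeg, fs):
    """Mean spectral power in each EEG band, via an FFT of the signal."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    return {name: power[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def select_rule(eeg, fs):
    """Activate the generative-rule class associated with the dominant band."""
    powers = band_powers(eeg, fs)
    return RULES[max(powers, key=powers.get)]

def tempo_from_complexity(eeg, fs, base_bpm=90.0, span_bpm=60.0):
    """Map normalized spectral entropy (a complexity proxy) onto tempo."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    p = power / power.sum()
    entropy = -np.sum(p * np.log2(p + 1e-12)) / np.log2(len(p))  # in [0, 1]
    return base_bpm + span_bpm * entropy  # more complex signal -> faster beat

# Synthetic 2-second "EEG" dominated by a 10 Hz (alpha) oscillation.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

print(select_rule(eeg, fs))   # alpha dominates, so rule_set_A is activated
print(round(tempo_from_complexity(eeg, fs), 1))
```

In the paper's system the analogous decision runs continuously over a sliding EEG window, so the active rule class and tempo track the subject's changing brain activity during a performance.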
Original language: English
Pages (from-to): 119-125
Number of pages: 7
Journal: International Journal on Disability and Human Development
Volume: 5
Issue number: 2
Publication status: Published - 20 Nov 2006

Keywords

  • Assistive music technology
  • Brain-Computer Interface
  • generative music systems
  • bio-signal music controller
