Affective brain–computer music interfacing

Ian Daly, Duncan Williams, Alexis Kirke, James Weaver, Asad Malik, Faustina Hwang, Eduardo Miranda, Slawomir J. Nasuto

Research output: Contribution to journal › Article › peer-review


Abstract

Objective. We aim to develop and evaluate an affective brain–computer music interface (aBCMI) for modulating the affective states of its users. Approach. An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results. The final online aBCMI is able to detect its users' current affective states with classification accuracies of up to 65% (3 class, p < 0.01) and modulate its users' affective states significantly above chance level (p < 0.05). Significance. Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to a user's affective states. Possible applications include use in music therapy and entertainment.
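The approach summarised above is a closed loop: estimate the user's current affective state from EEG, compare it with a target state, and select music-generation parameters intended to move the user toward that target. The sketch below illustrates that loop in outline only; all names (AffectiveState, detect_state, choose_music_parameters) and the simple rules inside them are hypothetical stand-ins, not the classifier, case-based reasoning system, or composition system reported in the paper.

# Hypothetical sketch of the closed-loop structure described in the abstract.
# None of these names or rules come from the paper; they only illustrate the
# detect -> compare with target -> adjust music loop.
from dataclasses import dataclass
import random

@dataclass
class AffectiveState:
    valence: float   # -1 (negative) .. 1 (positive)
    arousal: float   # -1 (calm) .. 1 (excited)

def detect_state(eeg_features):
    """Placeholder classifier: map EEG-derived features to valence/arousal.

    A real system would use a trained classifier (the paper reports up to 65%
    accuracy on a 3-class problem); here we simply average two feature groups.
    """
    return AffectiveState(valence=sum(eeg_features[:2]) / 2,
                          arousal=sum(eeg_features[2:]) / 2)

def choose_music_parameters(current, target):
    """Stand-in for case-based reasoning: nudge the music toward the target state."""
    return {
        "tempo_bpm": 90 + 40 * target.arousal,               # faster music for higher arousal
        "mode": "major" if target.valence >= 0 else "minor",  # brighter mode for positive valence
        "step": 0.25 * (target.valence - current.valence),    # gradual, not abrupt, transition
    }

if __name__ == "__main__":
    # Simulated EEG features for one epoch (stand-ins for real measurements).
    features = [random.uniform(-1, 1) for _ in range(4)]
    current = detect_state(features)
    target = AffectiveState(valence=0.8, arousal=-0.5)        # e.g. "calm and happy"
    print(current, choose_music_parameters(current, target))

In the actual study this loop would run online, re-estimating the state each epoch and regenerating music until the target objective (e.g. calmer or happier) is reached.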
Original language: English
Pages (from-to): 046022-046022
Journal: Journal of Neural Engineering
Volume: 13
Issue number: 4
Early online date: 11 Jul 2016
Publication status: Published - 1 Aug 2016
