TY - JOUR
T1 - The space between us: A live performance with musical score generated via affective correlates measured in EEG of one performer and an audience member
AU - Eaton, J
AU - Jin, W
AU - Miranda, E
PY - 2014/1/1
Y1 - 2014/1/1
N2 - The Space Between Us is a performance piece for vocals, piano and live electronics using a Brain-Computer Music Interface system currently in development. The brainwaves of one performer and one audience member are measured throughout the performance, and the system generates a real-time score mapped to emotional features associated with the brain signals. The system not only aims to portray emotional states through music but also to direct and induce emotional states through the real-time generation of the score, highlighting the potential of direct neural-emotional manipulation in live performance. Two accepted emotional descriptors, valence and arousal, are measured via electroencephalogram (EEG) recordings, and the two-dimensional correlates of averaged windows are then mapped to musical phrases. These pre-composed phrases contain associated emotional content based on the KTH Performance Rules System (Director Musices). The piece is in three movements: the first two are led by the emotions of each subject respectively, whilst the third movement interpolates the combined response of the performer and audience member. The system not only aims to reflect the individuals’ emotional states but also attempts to induce a shared emotional experience by drawing the two responses together. This work highlights the potential of effecting neural-emotional manipulation within live performance and demonstrates a new approach to real-time, affectively-driven composition.
M3 - Conference proceedings published in a journal
SN - 2220-4792
VL - 0
SP - 593
EP - 596
JO - Proceedings of the International Conference on New Interfaces for Musical Expression
JF - Proceedings of the International Conference on New Interfaces for Musical Expression
IS - 0
ER -