

Printed from Grove Music Online.


Brain-computer music interface [BCMI]

  • Anne Beetem Acker

Technology that allows a person to control a music-related output with commands expressed by brain signals. The output signal can control physical and virtual instruments and composition systems. Therapeutic applications include allowing severely physically disabled persons to participate actively in music-making. A number of methods of detecting and measuring brain activity have been tried; electroencephalography (EEG) has proved to be the most practical. Neural activity generates electric fields that can be detected by EEG electrodes placed on the scalp. The electrodes are placed in an array that allows mapping of neural activity over time. The signals are very weak and must be amplified and broken into frequency bands commonly labeled from low to high as Delta, Theta, Alpha, low Beta, medium Beta, and Gamma.
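The decomposition of an amplified EEG signal into named frequency bands can be sketched as a spectral power calculation. The band boundaries below are illustrative assumptions (exact limits vary across the literature), and the function names are hypothetical, not part of any BCMI system described here.

```python
import numpy as np

# Hypothetical band limits in Hz, ordered low to high; an assumption for
# illustration, since published EEG band boundaries vary.
BANDS = {
    "Delta": (0.5, 4.0),
    "Theta": (4.0, 8.0),
    "Alpha": (8.0, 13.0),
    "Low Beta": (13.0, 20.0),
    "Medium Beta": (20.0, 30.0),
    "Gamma": (30.0, 45.0),
}

def band_powers(signal, sample_rate):
    """Return the spectral power of a one-channel signal in each EEG band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return {
        name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for name, (lo, hi) in BANDS.items()
    }

# A 10 Hz oscillation lies in the Alpha range, so the Alpha band
# should carry almost all of the power of this synthetic signal.
t = np.arange(0, 2.0, 1.0 / 256)
powers = band_powers(np.sin(2 * np.pi * 10 * t), 256)
```

Real systems typically use band-pass filters on a continuous stream rather than a one-shot FFT, but the mapping from raw voltage to per-band power is the same idea.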

Three approaches are used for making music with BCMI. The most common approach, ‘direct sonification’, translates the EEG signal directly into sound; ‘musification’ translates the EEG signal by generating musical sequences based on the EEG behavior; ‘control’ detects specific EEG patterns produced by the subject to control music software. Alvin Lucier made the first known efforts at music performance with the EEG in the 1960s with his piece ‘Music for Solo Performer’. He amplified the signals picked up from electrodes on his scalp, relaying them through loudspeakers directly coupled to percussion instruments including gongs, cymbals, timpani, and drums. David Rosenboom systematically studied the detection of human musical experience in EEG signals in the 1970s. A BCMI piano developed by Eduardo Miranda in 2007 uses information from EEG power spectrum analysis to direct an artificial intelligence system that streams MIDI data in real time to a MIDI-enabled acoustic piano. To produce each bar of music, the system checks the EEG and triggers generative music commands associated with the dominant EEG rhythm. Additional components of the signal control loudness and tempo.
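The bar-by-bar logic of such a system can be sketched as a lookup from the dominant EEG rhythm to a generative rule. The rules and band names below are hypothetical placeholders; Miranda's actual system uses an artificial intelligence component, not the toy mapping shown here.

```python
import random

# Hypothetical generative rules, one per EEG band: an assumption for
# illustration only. Each rule returns one bar as a list of MIDI notes.
RULES = {
    "Alpha": lambda: [60, 64, 67],                    # calm: a C major triad
    "Beta": lambda: random.sample(range(60, 72), 3),  # aroused: busier material
}

def next_bar(band_powers):
    """Pick the generative rule associated with the dominant EEG rhythm."""
    dominant = max(band_powers, key=band_powers.get)
    rule = RULES.get(dominant, RULES["Alpha"])  # fall back to a default rule
    return rule()
```

Each returned bar would then be sent as MIDI messages; loudness and tempo could be scaled from other components of the signal in the same way.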

Fig.1: Dr. Eduardo Miranda connecting a student’s brain to play a piano. Courtesy of Eduardo Miranda.

In 2008, Miranda and Vincent Soucaret devised a BCMI audio mixer where Alpha rhythms control the fader for a guitar solo on track three while low and medium Beta rhythms control a piano recording on track two. The first track contains a steady rhythm of bass and drums. Subjects easily trained themselves to control the mixer at will. Another project by Soucaret (2008) associated different notes with different electrodes to generate melodies. By tracking both Alpha and Beta rhythms and other EEG information simultaneously, they were able to generate two concurrent melodies as well as polyphonic music. In 2011, Miranda and others at the Interdisciplinary Centre for Computer Music Research at the University of Plymouth published results of using their EEG-based system on a subject with locked-in syndrome, a condition where the mind is aware but the individual cannot move or communicate verbally due to paralysis of nearly all voluntary muscles. The subject quickly learned to change the amplitude of her EEG to vary melodic and dynamic output, to copy notes played on a piano, and to play a melody with an independent background track.
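The mixer idea reduces to mapping a band-power reading onto a fader position. The sketch below follows the routing described above (Alpha to the guitar track, Beta to the piano track, a fixed rhythm track); the power range of 0 to 10 and the function names are assumptions, not values from the 2008 system.

```python
def fader_level(power, floor, ceiling):
    """Map a band-power reading linearly onto a 0..1 fader, clipped at both ends."""
    return min(1.0, max(0.0, (power - floor) / (ceiling - floor)))

def mixer_levels(alpha_power, beta_power):
    """Per-track gains in the spirit of the 2008 BCMI mixer (ranges assumed)."""
    return {
        "track1_rhythm": 1.0,                               # steady bass and drums
        "track2_piano": fader_level(beta_power, 0.0, 10.0),  # Beta rhythms
        "track3_guitar": fader_level(alpha_power, 0.0, 10.0),  # Alpha rhythms
    }
```

Since Alpha power rises and falls with relaxation, a subject can learn to push one fader up and another down at will, which is what made the mixer trainable.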


  • D. Rosenboom: Biofeedback and the Arts: Results of Early Experiments (Vancouver, 1976)
  • D. Rosenboom: ‘The Performing Brain’, Computer Music Journal, vol.14 (1990), 48–65
  • E. Miranda: ‘Plymouth Brain–Computer Music Interfacing Project: from EEG Audio Mixers to Composition Informed by Cognitive Neuroscience’, International Journal of Arts and Technology, vol.3 (2010), 154–76
  • E. Miranda: ‘Brain–Computer Music Interfacing: from Basic Research to the Real World of Special Needs’, Music and Medicine, vol.3 (2011), 134–40