MO

Anne Beetem Acker

Wireless motion-capture devices and software components that combine to turn practically any object into a gesture-operated musical instrument. They are the result of a research project at the Institut de Recherche et Coordination Acoustique/Musique (IRCAM) involving NoDesign, a product design firm. The investigators include Nicolas Rasamimanana, Frédéric Bevilacqua, Norbert Schnell, Fabrice Guédy, Emmanuel Fléty, Côme Maestracci, and Bruno Zamborlin of IRCAM, and Jean-Louis Fréchin and Uros Petrevski of NoDesign. The second generation of MO prototypes was built by DaFact, a Paris-based MIDI instrument firm. The project won first place in the Margaret Guthman Musical Instrument Competition in 2012.

The components are designed to let users create novel instruments without knowledge of programming, engineering, or electronics. Software components include motion capture, gesture analysis and recognition, and real-time audio processing; these are integrated into Max/MSP, an interactive graphical data-flow environment for audio and multimedia programming. The user records examples of the desired gestures, and the system learns to recognize them from a single training session. Recognized gestures can then be used either for discrete triggering or for continuous control. Audio processing is provided by a set of synthesis and sound-transformation modules that allow recorded sounds to be modified, for example with granular or phase-vocoder techniques that alter some characteristics of a sound while preserving others, such as stretching it in time without changing its pitch....
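The MO components implement their own real-time gesture-following algorithms inside Max/MSP; the single-example, template-based idea behind them can nonetheless be illustrated with a short sketch. The Python example below is hypothetical and not part of the MO software: it stores one recorded accelerometer trace per gesture and compares new performances against those templates with dynamic time warping, firing a discrete trigger when a match is close enough. The frame shape (three-axis samples), the threshold, and the class and gesture names are illustrative assumptions.

import numpy as np

def dtw_distance(a, b):
    # Length-normalised dynamic-time-warping distance between two gesture
    # recordings, each an array of shape (frames, 3) of accelerometer samples.
    # DTW lets a gesture match its template even when performed faster or slower.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)

class GestureRecognizer:
    # Keeps one training example per gesture and reports the closest match.
    def __init__(self):
        self.templates = {}

    def train(self, name, recording):
        # A single demonstration is enough to register a new gesture.
        self.templates[name] = np.asarray(recording, dtype=float)

    def recognize(self, recording, threshold=1.0):
        # Discrete triggering: return the best-matching gesture name,
        # or None if nothing is close enough to fire an event.
        recording = np.asarray(recording, dtype=float)
        scores = {name: dtw_distance(recording, tmpl)
                  for name, tmpl in self.templates.items()}
        best = min(scores, key=scores.get)
        return best if scores[best] < threshold else None

# Example use with stand-in data in place of real motion-capture frames.
recognizer = GestureRecognizer()
recognizer.train("shake", np.random.randn(40, 3))
recognizer.train("tilt", np.cumsum(np.random.randn(60, 3), axis=0))
print(recognizer.recognize(np.random.randn(50, 3)))

Continuous control would instead report, as a gesture unfolds, how far along the best-matching template the performance has progressed and how closely it matches, which the MO software handles with its own real-time following algorithms rather than the offline comparison sketched here.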