
Article

Hugh Davies

revised by Anne Beetem Acker

[Elektronmusikstudion] (Swed.: ‘electronic music studio’)

The Swedish national centre for electronic music and sound art, in Stockholm. It was preceded by a smaller studio run by the Workers' Society of Education from 1960. EMS was established by Swedish Radio in 1964 under music director and composer Karl Birger Blomdahl (1916–68), who hired the composer and performer Knut Wiggen (b 1927) to take charge of creating the studios. In 1965 an old radio theatre studio called the klangverkstan (‘sound workshop’) opened for composers. Construction of a new facility was begun, but after Blomdahl’s death EMS became independent, funded only in small part by Swedish Radio, and otherwise by Fylkingen (a society for experimental music and arts) and the Royal Swedish Academy of Music.

Wiggen envisioned EMS as both a place to produce electro-acoustic music and a research institution that would give the composer ‘the possibility of describing sounds in psychological terms’. The studio was equipped accordingly. The sound sculpture ...

Article

Anne Beetem Acker

Interactive computer network used as an extended musical instrument, played by a San Francisco Bay Area experimental computer network band also called The Hub. The band, founded in 1985 by Tim Perkis and John Bischoff, evolved from the League of Automatic Music Composers (1978–83). The concept of The Hub is to create live music resulting from the unpredictable behaviour of the interconnected computer system. The composer/performers consider their performances a type of ‘enhanced improvisation’.

Initially The Hub provided a custom-built central ‘mailbox’ computer and made use of a MIDI network providing communication between the composer/performers’ synthesizers. With the maturation of commercial MIDI equipment, the band shifted to using the Opcode Studio V multiport MIDI interface for their hub. Since MIDI is designed to allow one player or computer to control a group of synthesizers but not to allow a network of synthesizers to interact, band member Scot Gresham-Lancaster devised a way to program the system so the Opcode Studio V could route messages among all the synthesizers in the network....
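The routing principle described above can be sketched in a few lines of Python. This is an illustrative model only, not The Hub's actual software: the class and player names are hypothetical, and the sketch simply shows how a central hub can turn MIDI's one-to-many control model into a many-to-many network by forwarding each incoming message to every node except its sender.

```python
class Hub:
    """Toy model of a central 'mailbox' hub (hypothetical names;
    not The Hub's actual implementation)."""

    def __init__(self):
        self.players = {}  # player name -> inbox of received messages

    def connect(self, name):
        self.players[name] = []

    def send(self, sender, message):
        # Forward the message to every player except its sender,
        # giving a many-to-many network over a one-to-many bus.
        for name, inbox in self.players.items():
            if name != sender:
                inbox.append((sender, message))

hub = Hub()
for name in ("player_a", "player_b", "player_c"):
    hub.connect(name)

hub.send("player_a", {"note": 60, "velocity": 90})
print(hub.players["player_b"])  # [('player_a', {'note': 60, 'velocity': 90})]
```

A real implementation would carry MIDI bytes over serial or network links rather than Python objects, but the routing logic is the same.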

Article

Alexander Bonus

[iPhone, Android, smartphone]

Portable electronic communication device. Such devices have become robust platforms for digital audio production, composition, and music performance since the beginning of the 21st century. Recent compositions for mobile-phone ringtones might represent an emerging music genre. Since 2008, many commercial apps have transformed mobile devices into miniature synthesizers. Popular virtual-instrument programs such as Ocarina (2008) by the Smule Corporation and Band (2008) by MooCowMusic harness the phone’s numerous interfaces in various ways. Multi-point touch screens offer players the ability to manipulate graphical fingerholes, fretboards, drum pads, and keyboards, thus approximating the playing experience of acoustic wind, string, percussion, and keyboard instruments. Beyond its use in voice recording and transmission, a device’s microphone can register breath intensity, enabling users to initiate tones and alter dynamics as though playing a wind instrument.
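The breath-control idea can be illustrated with a small sketch: the microphone's input level (its root-mean-square amplitude) gates and scales a tone's loudness. The function names and the threshold value here are hypothetical, chosen only for illustration; apps such as Ocarina do something conceptually similar but with their own signal processing.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a block of audio samples
    (values assumed normalized to the range -1.0..1.0)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def breath_to_gain(samples, threshold=0.02):
    """Map microphone level to a tone gain: silence below the
    threshold produces no note; a stronger breath produces a
    louder tone, clipped at full gain 1.0."""
    level = rms(samples)
    if level < threshold:
        return 0.0
    return min(level / 0.5, 1.0)

print(breath_to_gain([0.0] * 64))  # 0.0 -> no breath, no tone
print(breath_to_gain([0.5] * 64))  # 1.0 -> strong breath, full gain
```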

Some mobile sound-production programs feature real-time voice manipulation, including auto-tune or pitch correction. Additional levels of musical functionality can be mapped to a phone’s accelerometer (an internal sensor that measures acceleration, from which the device’s motion and orientation can be inferred). When the device is swung, shaken, or tilted, the accelerometer can trigger alterations in timbre, vibrato, pitch, and other variables. More advanced uses have been proposed. For example, a phone’s camera, acting as a real-time motion sensor, could affect many aspects of sound synthesis and sequencing; and the GPS (global positioning system) indicator has the potential to take location markers from other phones across the planet and turn those data into sonic information....
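A minimal sketch of one such mapping, under assumed conventions (the function name, axes, and scaling are hypothetical): tilt is derived from the angle between the gravity vector reported by the accelerometer and the device's z axis, and that angle is scaled to a pitch-bend amount in semitones.

```python
import math

def tilt_to_pitch_bend(ax, ay, az, max_bend=2.0):
    """Map an accelerometer reading (in g, at rest dominated by
    gravity) to a pitch bend in semitones: lying flat (z axis
    aligned with gravity) gives no bend; tilting the device
    toward inverted gives up to max_bend semitones."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return 0.0  # free fall or bad reading: no bend
    # Angle between gravity and the device's z axis, 0..pi radians.
    angle = math.acos(max(-1.0, min(1.0, az / g)))
    return max_bend * angle / math.pi

print(tilt_to_pitch_bend(0.0, 0.0, 1.0))  # 0.0 -> flat, no bend
print(tilt_to_pitch_bend(1.0, 0.0, 0.0))  # 1.0 -> on edge, half bend
```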

Article

Anne Beetem Acker

Table-height electronic display and controller (interface) with a touchscreen top that can detect two or more simultaneous points of contact on its surface. Multi-touch tables typically include an integral computer to process the screen’s input and output as well as any other associated outputs such as audio. The screen surface (a sheet of glass or polymer) is lit by an array of infrared LEDs around the edge of the screen inside the table, and a short-throw projector displays an image (e.g. a keyboard) on the screen from below. Some form of touch-sensing technology, such as surface capacitance, SAW (surface acoustic wave), infrared grid technology, or FTIR (frustrated total internal reflection), is used to detect and locate touches on the image. An internal camera sends data to the computer, which then deduces where the fingers have pressed and uses that information to control an application (app). Multi-touch tables use either custom software or a touchscreen package such as Touchlib....