Networked_Music_Review

Robotic Ecologies and Emergent Systems in Music

This past spring at the University of Virginia, a first-time joint class brought graduate students from the Virginia Center for Computer Music (VCCM) together with undergraduates in the School of Architecture. The undergraduate Robotic Ecologies class merged with the Emergent Systems in Music graduate class, co-taught by professors Jason Johnson (architecture) and Matthew Burtner (music), with assistance from music graduate student Troy Rogers. I had the opportunity to participate in this exciting new venture between our departments. The goal of this year’s class was for students to create and fabricate “performative spatial and acoustic instruments that sense, compute and interact to/with emergent atmospheric inputs.” The class’s group collaborations resulted in three new robotic sonic-spatial instruments. Movies and descriptions of the instruments are provided below; descriptions were written by the groups, and video footage was provided by Jason Johnson.

E.X.S.O. (Emergent Proximity Sensing Object)
Team Members: Scott Barton, Jaime De La Ree, Steven Johnson, Steven Kemper, Kezia Ofiesh


E.X.S.O.

E.X.S.O. is designed for human participation in the production of rhythms. As people interact with the moving arms, the arms respond in an immediate one-to-one fashion and additionally generate rhythms played on resonant tubes. The tempo of these rhythms is based on proximity to the device, and as each arm moves in relation to the human participants, the pitch of its tube changes. At first, participants notice a one-to-one relationship between their proximity and the rhythms produced, but as time goes on the system begins to react on its own to the humans in the room: working with them, working against them, or ignoring them completely.
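The proximity-to-tempo mapping and the gradual shift from direct response to autonomous behavior can be sketched in a few lines. This is a minimal illustrative model, not the group's actual Max/MSP patch; all function names, ranges, and the linear cross-fade are assumptions.

```python
def proximity_to_tempo(distance_cm, min_cm=10, max_cm=80,
                       min_bpm=40, max_bpm=180):
    """Map an IR proximity reading to a striking tempo.

    Closer participants produce faster rhythms; readings are
    clamped to an assumed usable sensor range of 10-80 cm.
    """
    d = max(min_cm, min(max_cm, distance_cm))
    # Linear map: nearest distance -> fastest tempo.
    frac = (max_cm - d) / (max_cm - min_cm)
    return min_bpm + frac * (max_bpm - min_bpm)


def blended_tempo(distance_cm, autonomy, internal_bpm=90):
    """Cross-fade from one-to-one response (autonomy=0) toward the
    system's own internal rhythm (autonomy=1), modeling how the
    instrument drifts away from strict human control over time."""
    direct = proximity_to_tempo(distance_cm)
    return (1 - autonomy) * direct + autonomy * internal_bpm
```

Raising `autonomy` over the course of an interaction is one simple way to produce the "working with them, working against them, or ignoring them" effect described above.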

The skin that connects one arm to the next is a sub-structural system intended to provide lateral structural stability and also to serve as a generative spatial component. As the arms move independently of one another, the skin takes on a range of dynamic shapes that conform to the three arm positions. The structural skin can take on many spatial qualities that result from the proximity sensors’ input. While the infrared sensors suit a small-scale presentation, the input could come from any type of sensor; this could make the space-changing quality of the arms a more functional component of a larger-scale design.

Arm movement is controlled by a DC motor attached to gears that interface with the part of the arm that enters the tube; this motor simultaneously changes the tube’s pitch and the arm’s position. A solenoid is connected to a beater that strikes the tube to produce sound, which is captured and amplified by electret microphones at the ends of the tubes. LEDs attached to the arm inside the tube illuminate when the arm moves, providing a visual trace of each arm’s movement and a visual notation of the sound being produced. The entire process is controlled by a computer running Max/MSP, which interfaces with an Arduino microcontroller attached to the sensors, motors, and LEDs. Software parses the data received from the sensors, and internal algorithmic processes produce emergent behavior as the arms react to their human observers.
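The sensor-parsing step is worth a concrete note: the Sharp GP2Y0A21YK listed in the hardware returns a nonlinear analog voltage, so software typically linearizes it before mapping it to musical parameters. The sketch below uses a commonly cited empirical power-law fit for this sensor family; the exact constants are an approximation, not taken from the group's software.

```python
def ir_raw_to_cm(raw, vref=5.0, adc_max=1023):
    """Convert a 10-bit ADC reading from a Sharp GP2Y0A21YK IR
    ranger to an approximate distance in centimetres.

    Uses a widely used empirical fit (distance ~ 27.86 * V^-1.15),
    reasonable only inside the sensor's ~10-80 cm range.
    """
    raw = max(raw, 1)                # guard against divide-by-zero
    volts = raw * vref / adc_max
    return 27.86 * volts ** -1.15
```

Because the response is inverse, a *larger* ADC reading means a *closer* participant; the converted distance would then feed a mapping like the tempo function described earlier.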

Instrument Materials: 1/4″ Plexiglas, 1/8″ Plexiglas, 1/4″ threaded rod, 3/16″ nuts and bolts, zip ties, birch wood, wool fabric, 3/4″ clear tube. Hardware: 3 24VDC reversible gearhead motors, 3 24VDC Ledex solenoids, 6 ultrabrite aqua LEDs, 3 IR sensors (Sharp GP2Y0A21YK), 1 24V power supply, 2 Arduino Diecimila microcontrollers. Software: Arduino running Firmata 1.0, Max/MSP 4.6

Medusa
Team Members: Steven Brummond, Taylor Burgess, Yuri Spitsyn, Jonathan Zorn, Susanna Wong


Medusa

In Greek mythology, Medusa was once the most beautiful woman in the world, until she angered the goddess Athena, who turned her into a hideous monster whose hair was made of snakes. She could transform any man who looked at her into stone with a single glance. The hero Perseus eventually defeated her by cutting off her head, from which Pegasus, the winged horse, was born.

Medusa is an emergent instrumental environment that reacts to human force. Medusa depends on a field of modules that are individually activated by a person’s touch; when one module is activated, it changes the states of its neighbors. State changes are registered by the humming of the module. Each module comprises a half-spherical acrylic structure, a single solenoid in the center, a drum head, LED lights, a rotating motor on one side, and a piezo disc connected to piano wire on the other side. A module is triggered when a person hits the piano wire; this in turn fires the solenoid, which strikes the drum, changing the state of the module. The state of the module refers to its humming, which is produced by a gear rubbing against a guitar string, transmitting vibrations into the drum head and generating sound. The speed of the motor is a function of the force a person applies to the piano wire, and once a module is triggered, a delay prevents it from being triggered again for another ten seconds. The emergence of Medusa develops from an array of people hitting the piano wires with different forces: the modules continuously change state and react with different motor speeds, and this myriad of reactions begins to develop a pattern of emergence through variation and consistency of response.
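The module behavior described above (force-dependent hum speed, a ten-second refractory delay, and state changes spreading to neighbors) can be sketched as a small state machine. This is an illustrative model under assumed parameters, not the group's actual code; the neighbor-attenuation factor in particular is invented.

```python
class Module:
    """Toy model of one Medusa module: a hit sets the hum-motor
    speed and starts a 10-second refractory period during which
    further hits are ignored (times are in seconds)."""
    REFRACTORY = 10.0

    def __init__(self):
        self.motor_speed = 0.0
        self.last_trigger = None

    def hit(self, now, force):
        """Trigger the module; returns True if the hit registered."""
        if (self.last_trigger is not None
                and now - self.last_trigger < self.REFRACTORY):
            return False              # still inside the delay window
        self.last_trigger = now
        self.motor_speed = force      # hum speed follows hit force
        return True


def propagate(modules, index, now, force, spread=0.5):
    """Strike one module and nudge its immediate neighbours with
    attenuated force: the simplest version of state spreading."""
    modules[index].hit(now, force)
    for j in (index - 1, index + 1):
        if 0 <= j < len(modules):
            modules[j].hit(now, force * spread)
```

With many people striking different wires at different forces, the interplay of refractory windows and neighbor nudges is what lets patterns emerge rather than being directly played.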

Instrument Materials: acrylic structure, polyester plastic drum head, guitar string, piano wire, threaded rods, bolts, LED lights, piezo disc. Hardware: 3 Arduino microcontrollers, advanced circuits. Software: Max/MSP.

Panta Rhei
Team Members: Andrew Hamm, Lanier Sammons, Jen Siomacco, Wendy Stober, Peter Traub


Panta Rhei

The concept of Panta Rhei derives from the philosophy of Heraclitus, the pre-Socratic Ionian philosopher. Translated, panta rhei means “everything flows,” or “everything is in a state of flux.” Heraclitus is well known for his belief that constant change is central to the state of the universe.

Panta Rhei is an audio/visual instrument capable of displaying an emergent system in light, allowing human interaction with that system, and translating the resulting information into both music and robotic choreography. Human interaction happens within the grid as observers insert their hands to block the flow of light between LEDs and corresponding photoresistors. The sonic elements of the piece are realized with Max/MSP. The brightness levels of individual LEDs (or groups of LEDs) may be made musical in several ways. In the current incarnation, LEDs are tied to a bank of oscillators whose envelope and pitch are determined by the level of light. A Mylar skin manipulated by solenoids provides the robotic choreography. The solenoids also respond to changes in the light level of the LED/photosensor grid. Data from the grid is monitored in Max/MSP and relayed to the solenoids through a microcontroller.
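The mapping from grid brightness to the oscillator bank can be sketched directly. The function names, frequency range, and squared-amplitude taper below are illustrative assumptions, not details of the group's Max/MSP patch, which determines envelope and pitch from light level in its own way.

```python
def brightness_to_oscillator(level, base_hz=220.0, span_hz=440.0):
    """Map a normalised photoresistor brightness (0.0 = beam fully
    blocked, 1.0 = fully lit) to one oscillator's pitch and
    envelope amplitude; a blocked cell lowers both."""
    level = max(0.0, min(1.0, level))
    freq = base_hz + level * span_hz
    amp = level ** 2          # taper quickly toward silence when dark
    return freq, amp


def grid_to_bank(levels):
    """Turn a whole LED/photosensor grid reading into settings for
    a bank of oscillators, one per cell."""
    return [brightness_to_oscillator(v) for v in levels]
```

An observer's hand blocking a beam would drive that cell's level toward zero, pulling its oscillator down in pitch and fading it out, while the same data stream is relayed to the solenoids that move the Mylar skin.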

Instrument Materials: Acrylic, piano wire, plastic zip-ties, mylar, metal brad connectors. Hardware: 12 Solenoids, 4 Arduino Microcontrollers, 18 LEDs and 18 photosensors. Software: Max/MSP


May 28, 2008