Networked_Music_Review

GypsyMIDI: Body as Musical Instrument

The Gypsy MIDI controller turns the human body into a musical instrument, gizmag, January 26, 2006.

Dance and music go together. Intuitively, we know they have common elements, and even if we cannot fully explain what those elements are or how they complement one another so perfectly, it is clear that both are expressions of something deep and fundamental within all human beings. Both express things that words cannot; beyond intellect, they are perhaps two of the basic building blocks of human expression, common to the souls of all people. That is why, when we saw a machine that links the two, we knew something special was brewing. The GypsyMIDI is a unique instrument for motion-capture MIDI control: a machine that enables a human being to become a musical instrument, or, to be exact, a musical instrument controller, or any number of other things, depending on your imagination.

Most importantly, the entire package is commercially available, with extensive customisation features that let you decide what each movement triggers: a colour, a sound, or something else entirely, anything that can be controlled by a digital interface. Set-up and operation are simple, intuitive and quick, and the possibilities for performance art and musical applications are landmark. One arm costs UKP480 (US$855), the whole MIDI suit costs UKP940 (US$1675), and the whole shebang (MIDI suit, wireless interface, tripod stand, interface software, and a manuals-and-videos CD) goes for UKP1240 (US$2210); that is the total price of admission to a new dimension. Like we said: landmark.

The suit is modeled on the human skeletal form, using rotational sensors at the joints. The GypsyMIDI simply plugs into a MIDI interface, and arm movements are converted into a real-time stream of MIDI data. The bundled eXo mapping software allows the user to define how the movements are translated into MIDI control, including the ability to trigger notes, generate continuous controller messages, or even play scales.
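To give a sense of the kind of mapping described above, here is a minimal sketch in Python. It assumes a joint's rotational sensor reports an angle in degrees; the function names, angle range, and scale choice are illustrative, not part of the actual eXo software or GypsyMIDI API. Scaling to 7-bit values and the Control Change/Note On byte layouts do follow the standard MIDI 1.0 message format.

```python
# Illustrative sketch: turn a joint angle into MIDI messages, the way a
# motion-capture mapping layer might. All names/ranges are assumptions.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def angle_to_midi_value(angle_deg, min_deg=0.0, max_deg=150.0):
    """Scale a joint angle into the 0-127 range of a 7-bit MIDI value."""
    clamped = max(min_deg, min(max_deg, angle_deg))
    return round((clamped - min_deg) / (max_deg - min_deg) * 127)

def control_change(cc_number, value, channel=0):
    """Raw 3-byte MIDI Control Change message (status byte 0xB0)."""
    return bytes([0xB0 | channel, cc_number, value])

def quantize_to_scale(midi_note, scale=C_MAJOR):
    """Snap a note number to the nearest pitch in the scale ('play scales')."""
    octave, pitch = divmod(midi_note, 12)
    return octave * 12 + min(scale, key=lambda p: abs(p - pitch))

def note_on(note, velocity=100, channel=0):
    """Raw 3-byte MIDI Note On message (status byte 0x90)."""
    return bytes([0x90 | channel, note, velocity])

# An elbow bent to 75 degrees, sent as CC 74 (commonly filter cutoff):
cc_msg = control_change(74, angle_to_midi_value(75.0))
# The same gesture triggering a note, snapped into C major:
note_msg = note_on(quantize_to_scale(61))  # C#4 snaps to C4
```

The same continuous sensor stream can thus drive either controller data or discrete, scale-quantised notes, which is essentially the choice the mapping interface exposes.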

Software included with the suit lets the user control any MIDI-enabled program, including Cubase, Live, Logic Audio, ProTools, MotionBuilder, Reason, Traktor DJ Studio, and any VST instrument or effect. Real-time control of sliders, cross-faders and buttons means that parameters such as volume, filter cutoff and resonance can be manipulated instantly.
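On the receiving side, a sketch of what a host program might do with the suit's controller stream: map a 7-bit CC value onto a filter-cutoff frequency. The exponential curve and the 20 Hz to 20 kHz range are common synth conventions assumed here for illustration, not GypsyMIDI specifics.

```python
# Hedged sketch: how a synth might interpret an incoming CC value as a
# filter cutoff. Exponential mapping gives a perceptually even sweep.

def cc_to_cutoff_hz(cc_value, lo_hz=20.0, hi_hz=20000.0):
    """Map a 0-127 CC value exponentially across the audible range."""
    t = max(0, min(127, cc_value)) / 127.0
    return lo_hz * (hi_hz / lo_hz) ** t

# CC 0 -> 20 Hz, CC 127 -> 20 kHz; an arm sweep becomes a filter sweep.
```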

The concept for the MIDI suit began to evolve in San Francisco's late-'90s dance scene. Seeing how the body expressed music through dance led the makers to experiment with the existing Gypsy Mocap system designed for 3D animation. The company wanted to explore the possibility of orchestrating and composing music for real-time performance through body movements and dance. This was the beginning of a diverse multimedia instrument that promises to add new dimensions to live performance for visual artists, DJs and musicians for years to come.

Now artists can have the advantage of a body instrument that allows music authoring in real-time performance. This mocap MIDI controller suit translates body movements into sounds, loops, lights and visuals, completely merging performers with their art and enabling a wide range of musical and visual applications. [via new media : ryan peter andre tobin]


Nov 8, 2006
