Networked_Music_Review

Reblogged sparkin’ it up

London’s audiovisual Howlin’ Wolf (it’s a sideburn thing), Toby Harris (aka *spark), has been steadily building strong live video performances since the turn of the century, honing his real-time video skills at countless festivals, in sophisticated audiovisual performances and, most recently, on giant touchscreen plasmas at motor shows. He also founded AVIT, the real-world spin-off of vjforums.com that prompted festivals around the world, so it was a pleasure to meet him at Sonar in Barcelona in mid 2007 and get his reflections on audiovisual possibility. Lotta words to follow, but worth the read for the pixel-inclined…

What appeals about real-time video manipulation, about ‘live cinema’?

The world is catching up with VJs in enjoying a spot of real-time video manipulation: just watch people using Photo Booth on any modern Mac. It’s compulsive, it’s fun! That term ‘Live Cinema’ is something close to my heart, though: I reckon you can specifically and deliberately combine a lot of what’s good in established cinema and clubbing to give a completely new way of expressing yourself as a VJ-esque performer while engaging with audiences’ own creative thoughts. The key to it is an improvisational use of narrative: rather than forcing a fixed story down their throats, you could be a cinematic incarnation of the oral storytellers of old, weaving tales on the fly, or providing the scenarios and juxtapositions that people find themselves compulsively mapping their own narratives onto. Stepping back from that, I’m interested in anything that uses media to make people interact or think in unexpected ways, which has taken me from playing with the conventions of one-man theatre to storytelling installations. And the tools are really hotting up at the moment; things are getting interesting.

Describe the live show you’ve developed and have been playing at various festivals…

‘rbn_esc’ is a project fusing cinema and live experimental visuals. Presenting a series of character scenarios, it invites the audience to construct narrative and cultural critique: rbn_esc >> urban escape. So it’s about the urban condition: what’s happening, the forces acting on it, whether we should be accepting it. Some of this is overt, such as pasting up provocative quotes; some of it you can’t miss, given my visual obsession with CCTV cameras (hard not to have, living in the UK); and some is for the audience to map their own actions and consequences from the loose narrative arc I present. I hope they wonder whether the escape in rbn_esc is a valid solution…

How does it come together technically?

I use Ableton Live talking to Vidvox’s VDMX on a MacBook Pro, with two Behringer control surfaces. It allows a sophisticated audio-visual mix, and a template for the performance means I can somewhat improvise the mixing while keeping it together as a whole. I’m really happy that we’re at a point where an ‘engine’ to churn it out in real time is clearly achievable, but boil it down and it’s only semi-live; it’s far from my ideal of being that proverbial oral storyteller, drawing on an archive of memories to make something new every time. I still haven’t seen the kind of interface that would let you truly improvise a fresh take each time. Well, ironically enough, that is except at the cinema, in films such as Minority Report.

If you can produce content and have an ear for a soundtrack, it really isn’t that difficult to make an audio-visual setup for yourself with a modern laptop that quite adequately gets you to a ‘semi-live, semi-meaningful’ state, akin to rbn_esc as it stands. Get some kind of audio sequencer that you can program with the building blocks of a DJ mix and sound effects, load the shots of your ‘film’ into a VJ program that can perform your editing and montage on the fly, and tie it all together with as much MIDI and as many ‘knobs and sliders’ as you see fit.
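The pattern described here — a sequencer and some knobs driving clip triggers and mix parameters over MIDI — can be sketched in a few lines. This is a toy stand-in, not anyone’s actual rig: the `ClipMixer` class, clip names and CC assignment are all hypothetical, and a real setup would receive the messages from hardware via a MIDI library and hand them to software such as VDMX.

```python
# Toy sketch of 'sequencer drives the visuals': MIDI-style messages are
# mapped to clip triggers and a crossfader. All names are hypothetical.

class ClipMixer:
    """Stand-in for a VJ app: holds a clip bank and a crossfader."""

    def __init__(self, clips):
        self.clips = clips          # MIDI note number -> clip name
        self.playing = None         # currently triggered clip
        self.crossfade = 0.0        # 0.0 = layer A, 1.0 = layer B

    def handle(self, msg):
        kind = msg["type"]
        if kind == "note_on":
            # a sequencer note triggers the clip bound to it
            self.playing = self.clips.get(msg["note"], self.playing)
        elif kind == "control_change" and msg["cc"] == 1:
            # map a 0-127 knob onto the 0.0-1.0 crossfader
            self.crossfade = msg["value"] / 127.0

mixer = ClipMixer({60: "cctv_loop.mov", 62: "street_pan.mov"})
mixer.handle({"type": "note_on", "note": 60})
mixer.handle({"type": "control_change", "cc": 1, "value": 127})
print(mixer.playing, mixer.crossfade)   # cctv_loop.mov 1.0
```

The point of the sketch is only the shape of the mapping: once every trigger and fade is a message, the same sequencer timeline that builds the DJ mix also performs the edit.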

What led you to dedicate such effort to exploring narrative within live video?

Even starting out as a VJ, I found myself dividing a night of club visuals into discrete sets, each with some kind of theme, playing with hook and flow. Then I got involved in a little theatre outfit, and we explored how my responsiveness onstage with laptop and camera could enhance the act of a stand-up storyteller. Soon enough, we were delving into TV-like documentary sections with b-roll footage edited live to the storyteller’s semi-improvised speech; we had the storyteller interacting with pre-filmed snippets of his other characters, not to mention many a coup de théâtre switching live cameras with staged pre-recorded chunks… it was a fun time, and it really showed the potential of live, improvisational audio-visual media.

What differences emerge from playing a similar set of audiovisual material again, as opposed to playing a similar set of music again?

You can listen to that CD seemingly ad infinitum, but the DVD will only get a play or two. There’s just something different in the way we experience a film compared to music. I don’t have the answers here, but that’s kinda the point: there’s space between these two forms, and that’s what we’re exploring. It could be that the film’s devotion to an all-consuming narrative, and the way it’s set up to deliver an exact experience to you as you watch it, means it leaves nothing to interest you on a second viewing; or it could be that the visual image is literal rather than abstract, and once you’ve seen it, well, you’ve seen it. At the moment I can only perform one route through my live-cinema piece, and so I have to rely on fresh audiences – not so hard given it’s a niche entertainment form – but my next big project is about giving me the tools as a performer to truly start exploring this.

As though to prove the live video performer is not checking their email, you were involved with an innovative trade show presentation using large touchscreen technology. Can you explain that?

I was asked to work with a production company developing a VJ installation to be used as a central attraction of a motor show stand. It was a groundbreaking project as a whole, and working on three 65” touchscreen plasmas surrounded by the public was quite something. Imagination, the production company, created a bespoke application that allowed us to playlist content submitted by the public around us, which we then published and imported into the VJ setup I created on the central screen. The real innovation, though, was in the project’s raison d’être: interacting with the audience to create films that embrace them, putting the audience up alongside the über-produced brand films playing on the mighty LED walls. For that, and for realising it was VJs who could make the magic there, Imagination deserve a lot of praise.

How did it feel to VJ in that kind of spotlight?

We were making a five-minute mix every twenty, all day, every day, in front of people who’d never seen anything like it. It was quite something, especially when they saw themselves on the six-metre-high LED wall we were outputting to, or heard their voice booming over the stand’s PA. What really impressed me was how working on that kind of surface transforms the act of performance – arms flailing everywhere – and how an interface designed specially for it can really communicate to the public just what it is you’re doing.

Relentlessly, digital tools are making it easier to make music or video. Who are VJs producing work you admire, and why do they stand out?

– the Light Surgeons, for so early on nailing the idea of an audio-visual performance broken out of the screen and into the fabric of the venue.
– bauhouse, for so perfectly realising what I see as the VJ/AV approach in their high-end ‘montage on the beat’ productions.
– visualnaut, a good friend and collaborator over the years, first with AVIT and then with Narrative Lab. Simply put, he’s a genius.
– and I recently bumped back into ameoba, who’s been trailblazing crazy-yet-super-refined A/V for years now. A welcome meeting; he’s a true original.

What attracts you to Quartz Composer?

If you look at a modern Mac desktop running Motion, you soon realise we’ve reached some kind of threshold in the development of all this real-time stuff: we can proverbially VJ with After Effects. Translating that to the realities of what you need as a performer, Vidvox’s VDMX combined with Quartz Composer seems the dream ticket. Still in beta, and with an interface that is as yet far from streamlined, it does the magic trick of handing you the keys to the studio, where every bit of kit is free. Want another preview monitor? There you go, sir. And if there’s some visual trick or bit of interactivity it can’t do, chances are you can make it yourself in Quartz Composer and it will load in as if it were coded by Vidvox themselves. At the high end, that’s pretty empowering. And if your needs are more specific still, you can take your ‘plug-in’ QC knowledge and make native Mac apps yourself with a bare minimum of code; or, if you’re willing to take the plunge (and it’s well worth it), you can extend QC itself with custom-coded plug-ins or partner QC-based rendering engines with bespoke interfaces. If you’re on a PC and feel the ninja-fu, go immerse yourself in the world of VVVV. You won’t have the system-wide integration enabling things like VJ apps using it for plug-ins, but you’ll get a much richer environment to build your own castle with.

Video content and improvisational abilities are important for VJs, but beyond those aspects, in what ways have you enjoyed seeing video artists involve themselves, simply or sophisticatedly, within events and environments?

The Ford project certainly grabs a handle on the future we were promised, where it isn’t just about ever bigger TVs broadcasting ever more channels with ever fancier graphics: its embracing of the audience through user-generated content and face-to-face interactivity really changes the relationship between media and the masses at events. The VJ set that was the most pleasant surprise to see last year was a beautifully simple operation from exyzt, who took a little wireless camera, ran around the club space and stage with it, always getting nice motion, and fed it into a framebuffer on a laptop controlled by a PlayStation controller. So their performance was the two of them dancing, one with controller and one with camera, sampling and triggering on the fly and wiggling the joysticks to overlay graphics on the action. Fun, and a consistent visual flow that fed the club back onto itself in the best way. As exyzt are a bunch of super-talented renegade architects with a string of huge installations and production pedigree to their name, it was doubly interesting to expect some mapped-space super-production and instead see something so simple. And of course, they hit the same theme of embracing the audience there.
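The exyzt rig described above — a live camera filling a rolling framebuffer, with a gamepad freezing and retriggering loops from it — can be sketched as a ring buffer. Everything here is a simulation for illustration: `FrameSampler`, the frame strings and the buffer size are made up, and a real version would read a capture device and a joystick API rather than a list of labels.

```python
# Sketch of a camera-to-framebuffer sampler: live frames roll through a
# fixed-size buffer; a controller button freezes the buffer as a loop.
from collections import deque

class FrameSampler:
    def __init__(self, size=90):             # e.g. ~3 s at 30 fps
        self.live = deque(maxlen=size)       # rolling camera feed
        self.sample = []                     # frozen loop, if any

    def feed(self, frame):
        """Camera delivers a new frame; oldest frame falls off."""
        self.live.append(frame)

    def grab(self):
        """Controller button: freeze the current buffer as a loop."""
        self.sample = list(self.live)

    def output(self, t):
        """Play the frozen loop if one exists, else pass live through."""
        if self.sample:
            return self.sample[t % len(self.sample)]
        return self.live[-1]

s = FrameSampler(size=4)
for f in ["f0", "f1", "f2", "f3", "f4"]:
    s.feed(f)                    # buffer now holds f1..f4
s.grab()                         # freeze those four frames
print(s.output(0), s.output(5))  # f1 f2
```

The appeal of the design is exactly what the set demonstrated: one performer generates motion, the other performs the sampling, and the whole instrument is just a buffer plus two triggers.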

What’d you learn from your AVIT experiences, and how do you feel about the global network of VJs today?

AVIT marked the moment in time when VJing transitioned from people inventing VJing in isolation to VJing being a recognised term and VJs being networked up in their home towns and beyond. Fuelled by the internet, there was mounting pressure for VJs to meet each other and actually see VJ practice that wasn’t their own, and AVIT was one of the main releases for that: it started as the physical spin-off, or incarnation, of the then-new and skyrocketing vjforums.com. In the UK, three years after our first event we produced a week-long symposium that really hit home to us that we’d met our objectives and the VJ world was established: the work was good, the networks were in place, and organisations were forming and taking up the baton. So now, for me, the focus has to be delivering on the potential of VJ practice, which means groundbreaking works, which means putting rocket boosters on interesting projects and talented people. Who and how? That’s an interesting project, and a continuing one. [posted by Sean Healy aka Jean Poole on Sky Noise]


Mar 14, 2008

Networked_Music_Review (NMR) is a research blog that focuses on emerging networked musical explorations.
