May 21, 2007
070707 UpStage Festival
Shadow puppets, flights of fancy, air guitar and a visit to a London building site will be some of the virtual attractions at 070707 UpStage Festival - a feast of online performances on July 7, 2007 to celebrate the release of UpStage 2.
New Zealand and international artists are creating work specifically for the UpStage environment, which will be performed for an online audience and simultaneously screened at the New Zealand Film Archive in Wellington.
UpStage is software that allows audiences from anywhere in the world to participate in live online performances, created in real time by remote players. Audiences need only an internet connection and web browser and can interact through a text chat tool while the players use images to create visual scenes, and operate "avatars" - graphical characters that speak aloud and move.
The diversity of proposals for the festival has impressed the organisers. "It's exciting to see UpStage being used in such a variety of ways," said UpStage project manager Helen Varley Jamieson. "We have all manner of artists - writers, musicians, dancers, performers, videographers, story-tellers - experimenting with how they can use the internet as a creative medium and a site for their work."
The full list of performances and artists is on the UpStage web site. Performance times will be publicised on the UpStage and New Zealand Film Archive web sites soon, and live links to the stages will be accessible from the UpStage web site on July 7; online audiences just need to click!
The performances will be screened live in the New Zealand Film Archive media gallery, where visitors can buy a coffee, take a seat and watch performances taking place in remote locations around the world. Exhibitions Manager Mark Williams says, "It will be like watching a live movie, as the shows unfold in front of our eyes."
UpStage workshop facilitator Vicki Smith has been providing graphic, technical and tutorial support for artists and education groups who are creating performances, and says that the level and range of work being produced promises breathtaking cyberformances (online performances) for audiences to view and take part in.
UpStage 2 is funded by the Community Partnership Fund of the NZ Government's Digital Strategy, with the support of partners CityLink, MediaLab and Auckland University of Technology, and developed by programmer and digital artist Douglas Bagnall.
The launch takes place on 28 June and will be accompanied by an exhibition at the NZ Film Archive from 28 June to 15 July, and the festival on 7 July.
For further information and images, contact:
Helen Varley Jamieson: helen[at]upstage.org.nz
Vicki Smith: vicki[at]upstage.org.nz http://upstage.org.nz/blog/
November 27, 2006
Open Walk-Thru and ADA Swaray
The last UpStage open walk-thru for 2006 will be combined with the last ADA Swaray for the year, and held on Wednesday December 6 at 9:00 pm NZ time. To join the online festivities, just point your browser here at party time (we have some surprises up our cyber-sleeves!). To find your local time, go here. If you'd like to log in and play, email me for a username.
October 31, 2006
A Tele-Immersive Collaboration
Synthecology combines the possibilities of tele-immersive collaboration with a new architecture for virtual reality sound immersion to create an environment where musicians from any location can interactively perform and create sonic environments.
Compose, sculpt, and improvise with other musicians and artists in an ephemeral garden of sonic lifeforms. Synthecology invites visitors to this digitally fertile space to create a musical sculpture of synthesized tones and sound samples provided by web inhabitants. Upon entering the garden, each participant can pluck contributed sounds from the air and plant them, wander the garden playing their own improvisation, or collaborate with other participants to author a new composition.
As each new 'seed' is planted and grown, sculpted and played, this garden becomes both a musical instrument and a composition to be shared with the rest of the network. Every inhabitant creates, not just as an individual composer shaping their own themes, but as a collaborator in real time who is able to improvise new soundscapes in the garden by cooperating with other avatars from diverse geographical locations.
Virtual participants are fully immersed in the garden landscape through the use of passive stereoscopic technology and spatialized audio, creating a networked tele-immersive environment where all inhabitants can collaborate, socialize and play. Guests from across the globe are similarly embodied as avatars throughout this environment, each experiencing the audio and visual presence of the others.
Participants from the WWW use a browser interface to contribute sound elements to the garden environment for use as compositional items. All the while, this real-time composition is streamed through web broadcast of the virtual environment to illustrate the audio-visual transformation of the garden. Broadcast throughout the entirety of the festival, Synthecology will celebrate the possibilities of collaboration, improvisation, and distributed authorship that exist on the horizon of an increasingly interconnected world.
As current advances in networking become commonplace, the creation of collaborative environments connecting remote individuals will become less involved. By augmenting the possibilities for users to share sensory presence through tele-immersive interfaces, Applied Interactives intends to combine the possibilities of real-time collaboration and socialization with the dynamics of digital creation and manipulation. Synthecology is a speculative glance at how the technology of today may be utilized to create new autonomous zones for sampling & re-mixing culture.
Synthecology is being created as a collaboration of students and faculty from the Electronic Visualization Laboratory at the University of Illinois at Chicago, The School of the Art Institute of Chicago, and Columbia College Chicago, and art(n) through the Applied Interactives organization.
ABOUT APPLIED INTERACTIVES
The purpose of Applied Interactives, NFP is to educate the art and science community about the medium of Virtual Reality as an interactive, computer-generated, immersive computer graphics environment. Applied Interactives, NFP plans to advance the medium through research and experimentation as well as provide a bridge to bring the technology out of institutional labs and into more publicly accessible arenas. Applied Interactives, NFP intends to propagate the medium by providing support and direct access to the resources necessary for artists and scientists to exhibit and develop works in the medium.
October 30, 2006
UpStage Walk-Through
Try out UpStage 2
The next open walk-through will be this coming Wednesday, 1 November, at 8pm NZ time. Convert to your local time here. We will be on the Swaray stage, so audience members should come directly here. If you'd like to log in and participate as a player, please email me for a log-in: it's no longer possible for more than one person to log in with the same username, so we need to make sure you aren't all trying to use the same guest log-in!
We now have the alpha version of UpStage 2 running on our server, so this walk-through is your first chance to try out the new features and enhanced interface (and help us identify bugs - this is the ALPHA version after all!). We're extremely grateful to the AUT student team - Beau Hardy, Francis Palmer, Lucy Chu & Wise Wang - who have been working on UpStage all year, and also to our wonderful programmer Douglas Bagnall, who is back on board to integrate the students' work and continue with other developments for UpStage 2.
Hope to see you online on Wednesday,
helen : )
PS - Wednesday is also Leena's birthday - she's immersed in her current project (the interactive TV series "Accidental Lovers") so can't join us for this walk-through, but we'll definitely be drinking a virtual toast to her virtual birthday in her virtual presence/absence ...
October 09, 2006
Jason Van Anden
Friday, October 13, 2006
7:30 and 10:00pm*
Brooklyn new media artist Jason Van Anden is unleashing his new interactive sound composition software IntelligentDesigner (I.D.) through an audience participation event at Williamsburg's A/V mecca MonkeyTown. I.D. allows the user to manipulate the visual composition of an array of colorful bubbles, each with an assigned sound. Its elegant, minimalist videogame interface enables a new kind of collaboration between artist and audience. As users alter the visual field, they alter the order in which the sounds are initiated, creating a living song of their own creation.
At the MonkeyTown event, four interactive kiosks will allow the audience to manipulate touch-screen versions of I.D. While participating artists will give the audience the sound palette, it will be up to the audience to construct these fragments into structures. As the four kiosks will be wired to enormous independent video screens, the audience will be submerged in a visual equivalent of the compositional forms they choose.
An impressive slew of eleven local and international sound artists was invited to take this exciting new software for a test spin, thanks to the excellent ears of musician Nat Hawks of Little Fury Things. Contributing artists include:
Leafcutter John (UK: renowned sound artist turned folk troubadour)
Datach'I (NYC: coveted post-jungle annihilist)
Nullsleep (NYC: world's favorite gameboy protégé)
Nonhorse (NYC: abstract tape collagist from the Vanishing Voice)
Lucky Dragons (Providence: tie-dyed laptops of progressive flute cut-ups)
Jason Forrest (Berlin: propulsive prog math-up prince)
CloudlandCanyon (Germany: psychedelic Casios bleed through digital fog)
Lullatone (Japan: heartbreaking tiny sounds)
Christian Science Minotaur (NYC: pastoral electro-acoustic)
Our Brother The Native (Michigan: fuzzy, blissful howling, now on FatCat)
Flying (NYC: fun folk frenzied with pots and pans)
Local participating artists will be on hand at the event.
Compiled is curated by Nat Hawks for LittleFuryThings records.
58 N. 3rd (btw. Wythe and Kent) L to Bedford
TIME: (2 identical sets) 7:30pm and 10pm
PRICE: $10 **
*reservations strongly suggested (can be made easily through venue website)
** admission includes a data CD containing the IntelligentDesigner software and interactive compositions from participating artists.
September 19, 2006
a platform for Live Visuals and Experimental Video
As a platform for Live Visuals and Experimental Video, initiated and run by Marcus Wendt and Vera Glahn, Sendung serves an on- and offline community via its virtual gallery and web features as well as events and workshops in real time and space.
Sendung brings together artists and designers from different fields, countries and levels to discuss and develop their work. Posted comments, critiques and reviews are intended to help raise a serious discourse on this very hybrid art genre. We pay special attention to the maintenance of the site and the project's network to make Sendung both an enjoyable showcase and a useful tool for artists and designers as well as curators, producers and facilitators. Visit us from time to time and be sure to find some inspiration!
Join In: Find the latest contributions in the frontpage gallery; browse the Archive for more, and use the advanced search to find topics, contributors and countries by tags. Get your own selection as a videocast and never miss a new upload by your favorite artist or in a certain genre. Post your own videos and make Sendung a stage for your own work! Recordings of big event performances and bedroom jam sessions are as welcome as non-live experiments. Registration and upload for Sendung are foolproof!
September 04, 2006
A Social, User-Centric Space
There were two great talks at the Pixelspaces panel. One of them was by Dan Phiffer (US) and Mushon Zer-Aviv (IL) who presented ShiftSpace, a project they developed at the ITP in New York (honorary mention in the PrixArs competition). I've got no image for it because my camera broke :-(
The first question they asked themselves was “Is there a public space on the web?” Then they demonstrated that the web is not really a public space.
The first example they gave to illustrate their point was Christophe Bruno's online work The Google Adwords Happening. Bruno tried to answer a question that had popped up on the Rhizome mailing list: "how do you make money from net.art?" He decided that he'd start by losing money, so he bought some advertising slots on Google. The ads were unrelated to the keywords he had bought the space for and mentioned no URL.
Many people saw his work. But after some time, he received a letter from Google saying something like "We think that you can do your work better; here are a few tips: put the keyword in the title, add a clear description of your website and, of course, please add the URL of your website." Bruno ignored the letter and left the ads as they were. He got another letter from Google threatening to take down his adverts if he didn't conform to the rules. The ads went offline, and Bruno's attempt to create public art on the web with the pages of Google ended there. His example shows that Google is a very private space.
Just like shopping malls...
Definition of public space from Wikipedia: "A public space or a public place is a place where anyone has a right to come without being excluded because of economic or social conditions, although this may not always be the case. One of the earliest examples of public spaces are commons. For example, no fees or paid tickets are required for entry, nor are the entrants discriminated based on background. Non-government-owned malls are examples of private space with the appearance of public space."
A personal anecdote from Dan: in California there's a fairly famous website, cockeyed.com, run by a prankster called Rob Cockerham. He wrote a text on his website asking his readers to follow him into a shopping mall, stalk him as he shopped and walked, take pictures of him and behave like paparazzi. But two picture-takers were warned by a security guard that unless their cameras were put away they'd be escorted out. When they asked, "But isn't the mall a public space?", they were told that, no, in fact a mall is private property. Shopping malls are like the web: they have the appearance of a public space but they are private.
What is Shiftspace?
If you google “falun gong” on google.com you’ll get a different result than on google.cn because Falun Dafa is censored in China. ShiftSpace adds a note on the Chinese google results that says “Please note that these results have been censored. The un-censored top results should be falundafa.org.”
The artists then showed how they annotated the Ars Electronica website, hacking its motif. Another example showed a banner on MySpace saying that it was Rupert Murdoch's space (he bought MySpace), so is it still "your" space?
How does ShiftSpace work?
You browse the net as usual, and when a note pops up you're informed that the website has ShiftSpace annotations on it. You can filter the notes, for example deciding to see only notes written by your friends, and you can notify ShiftSpace when a note is in fact spam. The developers were also inspired by the Digg system: ShiftSpace allows you to shift a note up or down according to its interest.
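As an illustration (not ShiftSpace's actual code), the Digg-inspired mechanism described above might look something like this: each annotation collects up/down votes and spam reports, and the visible list is filtered and ordered by net score. All names and the spam threshold are hypothetical.

```python
# Hypothetical sketch of digg-style note ranking, as described in the text.
# Class and function names are illustrative, not from ShiftSpace itself.

class Note:
    def __init__(self, author, text):
        self.author = author
        self.text = text
        self.score = 0          # net up/down shifts
        self.spam_reports = 0   # times readers flagged this note as spam

    def shift_up(self):
        self.score += 1

    def shift_down(self):
        self.score -= 1

def visible_notes(notes, friends=None, spam_threshold=3):
    """Drop reported spam, optionally keep only friends' notes,
    and order the rest by net score (most interesting first)."""
    kept = [n for n in notes
            if n.spam_reports < spam_threshold
            and (friends is None or n.author in friends)]
    return sorted(kept, key=lambda n: n.score, reverse=True)

a = Note("dan", "censored results note")
b = Note("mushon", "whose space is it?")
a.shift_up(); a.shift_up(); b.shift_down()
print([n.author for n in visible_notes([a, b])])  # ['dan', 'mushon']
```

The same filter could implement the friends-only view by passing `friends={"dan"}`.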
Brief history of the Metaweb.
In 1999, Thirdvoice.com allowed web users to add annotations to a webpage. In 2001, Third Voice had to close after facing many lawsuits.
In 2004, Greasemonkey let hackers inject code into a page to update it or add new features to the website.
If we compare Shiftspace, Greasemonkey and 3rd Voice, we get:
- Third Voice was a social system; it allowed just one application, was user-centric, and used proprietary code;
- Greasemonkey wasn't social; it allowed for various applications, had no aggregation, and was open code;
- ShiftSpace is social; it allows for various applications, is user-centric, has an aggregation interface, and its code is open.
The tool is very approachable and allows people who are not familiar with code to join the fun.
Now Shiftspace needs your help. How can you participate?
Use the platform and file bug reports, or help develop the core platform; the main development strands are APIs, data management, social network and reputation systems, and privacy.
Join the mailing list on community.shiftspace.org.
On Tuesday, everyone is invited to the Installation Party at the electrolobby, Dan Phiffer and Mushon Zer-Aviv will install ShiftSpace on your machine. It takes 30 seconds.
August 07, 2006
The open walk-through is a chance for you to learn about the UpStage environment and to play around in it with others. The next UpStage walk-thru will be held this coming Wednesday, 9 August, at the following times: California, USA: 2am; New York, USA: 5am; UK: 10am; Western Europe: 11am; Finland: 12 noon; Australia - NSW: 7pm; NZ: 9pm. Check here or here for your local time.
Audience members, point your browser at http://upstage.org.nz:8081/stages/presentation. If you want to participate as a player, email Helen Varley Jamieson for a log in--helen[at]creative-catalyst.com
June 23, 2006
Individually Reactive Space
PICAMOTICS is conceived as an individual public navigation system in which an individual's presence and movement in a space is interpreted and responded to by the system. The resulting "individually reactive space" knows its visitors: tracking their positions, it can follow movements and draw conclusions from individual behaviours.
Picamotics is also referred to as a "collaborative platform for interactive surroundings". In addition to navigation, proposed applications include educational scenarios where individuals are assigned attributes which, when they interact with others, demonstrate abstract concepts in a playful and social manner.
For example, "visitors represent basic mathematic operations such as division, multiplication etc. by walking through a field of numbers. If, for example, the visitor representing an addition of 8 meets the visitor representing a division by 2, the result of their meeting will be 4. Abstract laws of mathematics become tangible."
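The maths scenario quoted above can be sketched in code. This is purely an illustration of the described mechanic, assuming each visitor carries a unary operation and a "meeting" composes the two operations on a starting value; none of these names come from the actual system.

```python
# Hypothetical sketch of the Picamotics maths scenario: each visitor
# "represents" an operation, and a meeting composes them.

def add(n):
    """Visitor representing addition of n."""
    return lambda x: x + n

def divide(n):
    """Visitor representing division by n."""
    return lambda x: x / n

def meet(visitor_a, visitor_b, start=0):
    """Apply visitor_a's operation, then visitor_b's, to a starting value."""
    return visitor_b(visitor_a(start))

# The example from the text: "addition of 8" meets "division by 2" -> 4
print(meet(add(8), divide(2)))  # 4.0
```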
The system uses infrared camera tracking from above; the floor becomes both trackpad and display. Audio "sound showers" can also be supported. ipunkt is a similar responsive navigation system with a bit more depth on the development process.
June 22, 2006
Pace of Place
Pace of Place is a "portable, nomadic portal" that creates an "emergent performance of actions and data" of re-sampled activity in urban public wireless hot spots.
A net-enabled, webcam-equipped laptop tracks motion at the site and extracts one vertical line of pixels from the image of moving bodies/motion. Custom software both accumulates and explodes the pixel lines, generating an image frame every minute and a one-minute video loop of sampled time and activity every 30 image frames. The loops from different spaces/places capture and abstract the activity/pace relationship of the spaces.
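The accumulation step described above resembles a slit-scan technique: keep one vertical column of pixels from each captured frame and stack the columns side by side into a composite image. The sketch below is a minimal, hypothetical illustration of that idea; the frame dimensions, the simulated frame source and all names are stand-ins, not details from the project's actual software.

```python
# Hedged sketch of slit-scan style pixel-line accumulation.
# A frame is modelled as a list of rows; extract_column keeps one
# vertical line, and accumulate stacks one line per frame side by side.

def extract_column(frame, x):
    """Keep a single vertical line of pixels from a frame (list of rows)."""
    return [row[x] for row in frame]

def accumulate(frames, x):
    """Stack one column per frame into a composite image."""
    columns = [extract_column(f, x) for f in frames]
    height = len(columns[0])
    # Row r of the composite is the r-th pixel of each successive column.
    return [[col[r] for col in columns] for r in range(height)]

# Simulate 60 frames of a tiny 4-row, 8-column greyscale video feed
# (one frame per second -> one composite per "minute" of capture).
frames = [[[(t + c) % 256 for c in range(8)] for _ in range(4)]
          for t in range(60)]

composite = accumulate(frames, x=3)
print(len(composite), len(composite[0]))  # 4 rows, 60 sampled columns
```

Building the one-minute loops would then amount to grouping every 30 such composites into a video sequence.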
Marlon Barrios Solano continues the project in residence at The Aesthetic Technologies Lab at Ohio State University and blogs on the residency here where he documents his practice and shares his process.
June 15, 2006
Month Of Sundays Live A/V Internet Mixing - 25th June
Two Sites, Two Cities
FurtherNoise.org Presents: Month Of Sundays Live A/V Internet Mixing. Featuring John Kannenberg & Glenn Bach. Open Mix led by Ruth Catlow & Marc Garrett (Furtherfield & HTTP). Post performance soundscapes by Alex Young (Furthernoise). Date & Time: 16.00 - 18.00 hrs BST; Where: E:vent - 96 Teesdale Street, London E2 6PU.
As part of the Month Of Sundays series of live A/V internet performances Furthernoise.org is hosting this unique event featuring a cross continent A/V performance by Chicago based John Kannenberg mixing in real time with Glenn Bach who will be performing from his home in Long Beach, California. The performance is based on their Two Cities project, which began in 2003 using sounds, photos, objects and data collected on Glenn and John's daily walking commutes to compare and contrast the environments of their respective hometowns.
It will take place in the online file-mixing platform Visitors Studio, projected and amplified into the gallery space from www.visitorsstudio.org.
Come and join us at E:vent: bring your laptops and media files and collaborate. Following the performance, Furtherfield artists Ruth Catlow & Marc Garrett will lead an open mix where audiences both online and in the gallery can join in by uploading and mixing their own audio & visual files in an open collaborative mix. Files can be mp3, swf, flv or jpg and must be a maximum of 200K.
There will also be free refreshments and post-performance soundscapes by Alex Young, whose album 'Helicoids' is the new net release on Furthernoise.org.
As well as being shown at E:vent, the afternoon's performances will also be broadcast online in real time at The Watershed Media Centre, Bristol, and The Point CDC Theatre, New York.
Curated by Roger Mills. Furthernoise & Visitors Studio are Furtherfield.org projects, supported by Arts Council England.
Chicago-based sonic and visual artist John Kannenberg works with a variety of themes including primal natural forces, spirituality and mindful contemplation, melancholy and nostalgia, abstracted narrative tales, and the confluence of sonic and visual art. His major appearances include the Spark Festival 2006 (Minneapolis), so.cal.sonic 2005 (Long Beach), ISEA 2004 (Tallinn), and the Placard Festival 2003 (New York). John is the creator and curator of Stasisfield.com, an experimental music label and digital art space presenting works by a diverse collection of artists from around the globe.
Based in Long Beach, California, Glenn Bach is an active multidisciplinary artist influenced by the act of mindful walking and environmental sound. Bach has performed at Field Effects (San Francisco), the Big Sur Experimental Music Festival, and the Schick Art Gallery (Saratoga Springs, NY); he has curated a house concert series, Quiet (2003), and the week-long so.cal.sonic festival (2005), and is the founder of the research group Pedestrian Culture. His current project is a poem sequence, Atlas Peripatetic, inspired by an extensive mapping of sounds on his morning walk.
Ruth Catlow is an artist and co-director of Furtherfield, which she has formed and run in partnership with artist Marc Garrett since 1997. Ruth works with networked media in public physical spaces and on the Internet, exploring net art with new communities of artists and audiences, with less reliance on existing, traditional art-world hierarchies, and developing independent grass-roots expression and representation. She is exploring the potential of current network technology for promoting distributed creativity, which raises a whole series of issues: it creates a more permeable boundary between established arbiters of culture, artists and audiences, radically changing the life of the artwork in the world and the ways in which people come across it.
Marc Garrett is an Internet artist, writer, street artist, activist, curator, educationalist and musician, in a constant state of being renascent. He shares no allegiance to any one form of art or expression: 'For me, art, or rather creativity, is an intuitive strategy that involves learning, questioning, progressive thought and putting playful explorations into action'. Emerging in the late 80s from the streets exploring creativity via agit-art tactics, Marc declares his own and humanity's seemingly perpetual dysfunction, consciously using unofficial platforms such as the streets, pirate radio, net broadcasts, BBS systems, performance, intervention, events, pamphlets, warehouses and gallery spaces. In the early nineties he was co-sysop, with Heath Bunting, of the Cybercafe BBS.
June 08, 2006
Avatar Body Collision and the Aotearoa Digital Arts
The second ADA Swaray takes place this Sunday, June 11, 9pm NZ time. Join members of Avatar Body Collision and the Aotearoa Digital Arts network for virtual cocktails, fine frocks, flowing conversation and a few surprises.
The ADA Swaray is a social meeting space for digital artists and other interested people, in the online performance environment UpStage. All you need to participate is a browser with the Flash Player plug-in. Check http://www.worldtimeserver.com for your local time.
June 02, 2006
Next UpStage Walk-Through and Swaray
Online Performance Environment
The next open walk-through in UpStage will be on Wednesday 7 June at the following times: California, USA: 2am; New York, USA: 5am; UK: 10am; Western Europe: 11am; Finland: 12 noon; Australia - NSW: 7pm; NZ: 9pm. Check http://www.worldtimeserver.com for your local time.
The open walk-through is a chance for you to learn about the UpStage environment and to play around in it with others. To participate, point your browser at http://upstage.org.nz:8081/stages/presentation at the appointed time; if you'd like to log in & learn how to use UpStage, or if you already know how & want to come & have a play around, please email me for a log in: helen[at]creative-catalyst.com
Another date for your diary is Sunday 11 June, 9pm NZ time: the second ADA Swaray will take place, with more mischief, mayhem and maybe even a little morsel of discussion ... put on your best frock and join us at http://upstage.org.nz:8084/stages/swaray on sunday.
May 10, 2006
Extensions via Virtual World of Art
In Blind Love (click on Art), two visitors must find each other before they can find the way out of a pitch-black labyrinth. Each is armed with a weapon that shoots luminous particles. The particles explode on the walls and cascade into the corners, revealing the skeletal outline of the labyrinth's architecture. The two visitors must use the light emitted by their weapons to find one another but, in an inversion of normal game logic, must not hit and kill each other if they hope to win. Like lovers, the visitors must blindly trust each other in order to survive, and must approach one another despite the danger.
Blind Love is one of the art works created for EXTENSION (click on Virtual World of Art, Extensions/SAT), a virtual architectural intervention that explores the potential of digital architecture to reveal and transform urban, cultural, and spatial identities. EXTENSION was created for the SAT in Montreal, one of North America's leading centres for new media. Visitors entering EXTENSION through the SAT terminals find themselves in a virtual space that mirrors their actual physical surroundings.
As they move through the environment, they cross from reality into virtuality, and from simulation into representation. Within the virtual environment, the existing building of the SAT has been recreated as a realistic 3D model. This simulation is transformed and reconfigured with the addition of a vast, zeppelin-shaped structure affixed to the roof. From inside this glass-encased space, visitors have a panoramic view of a virtual Montreal, where the city itself has been reconstructed with a poetic blend of realism and utopian fiction. Visitors to this near-real space discover experimental artworks in the halls and rooms of the virtual environment, and layered in other dimensions accessible through portals that bring the viewer into dedicated art installations. Anchoring virtual experience in real space, EXTENSION blurs the boundaries between imagination and perception. Grounded in the real but not limited to the realistic, EXTENSION is a reflection of the concepts and visions each of us forms about spaces and places.
EXTENSION is one of three nodes of the Virtual World of Art.
As technologically mediated experiences (the Internet, games, cinema, television) occupy ever more of our time and energy, it becomes increasingly vital that we create alternative spaces and environments for experience within these borderless territories of information and communication. Virtual World of Art invokes, manipulates and transforms the vocabulary and logic of game culture to create engaging artistic spaces which awaken aesthetic, emotional, social and intellectual responses radically different from those engendered by contemporary mass media. The project explores a variety of significant problems that cultural institutions committed to artistic practice within a networked electronic environment continue to face: how to create aesthetically compelling and emotional online experiences, how to link physical and data spaces together, and how to effectively enable human communication and exchange in a physical space where local and remote visitors and performers can communicate virtually through technologies and computer interfaces.
Virtual World of Art investigates the artistic possibilities of immersive game technology and proposes new models for experiencing architectural and public spaces, sociability, and cultural production. Grounded in reality but not limited to realism, Virtual World of Art forms a dynamic, socially relevant, independent alternative to the existing art world, enabling artists to emancipate themselves and develop a critical view of the future and meaning of contemporary art.
Virtual World of Art is the title of a series of new media art projects which subvert and reconfigure multiplayer game technology to create a network of artistic virtual environments. Each of these virtual environments, called nodes, contains digital artworks and virtual art installations and is associated both conceptually and thematically with a specific site, arts centre or public event where the project is presented on a long-term basis. These virtual environments are connected together by the internet, forming a new kind of enlarged social and experiential public space for artistic expression and social exchange.
VWA will connect at least 3 locations or nodes in order to function as a complete project. Each node explores a different notion of hybrid between physical and virtual public space. Visitors will be able to interact through terminals installed at the nodes or at any public space in the world that has a high-speed internet connection.
Virtual World of Art is a project of Workspace Unlimited.
May 05, 2006
Aotearoa Digital Arts + UpStage
You're invited to the first ADA Swaray, on Sunday 7 May at 9pm New Zealand time.
What is the ADA Swaray? Aotearoa Digital Arts, in conjunction with UpStage, is hosting a series of "swarays" - informal social gatherings for digital artists and other interested folk to meet, talk and play online. The idea was born following a demonstration of UpStage at last year's ADA symposium, _emerge_, partly as an opportunity to get to know artists coming to SCANZ but also as an informal networking opportunity for digital artists in New Zealand and overseas.
How do I attend? Put on your favourite frock and point your browser at http://www.aotearoadigitalarts.org.nz/swaray; use the link to check your local time if you're not in NZ, and at the appointed time follow the link to the Swaray Stage. Type into the text chat window to join in the conversations and mingle with the assembled guests. Virtual champagne, cocktails and canapes will be served.
May 04, 2006
An online environment that allows artists, designers, Photoshop junkies, pixel pushers, collage artists & photographers to collaboratively design & edit a single image. In kollabor8, each image chain is an open digital image mutation collaboration, displayed like threads in a forum. Each link in the chain is in some way a derivative of the previous image, as designers iteratively add aesthetic features in a collage-like way. See also gridlove & swarmsketch. [blogged on information aesthetics] [Related: Imprimatur]
April 25, 2006
Recursive video in Second Life
Had our first meeting inside of Second Life today to talk about the island. Thanks to everyone who showed up. Special thanks to Aimee Weber who donated an amphitheater! She showed me how to play videos inside of Second Life. So... I tried making a very recursive video of me watching a video of me watching a test video of myself. Watching a video together inside of Second Life actually works well. The audio and video quality is excellent and you can chat about the video and other things while you watch. It's really neat sharing a space like this together...
"...Business Week (via Terranova) has a podcast, two slideshows and a story about Second Life and other MMORLG, money and advertising. Good wrap-up if you're interested in the topic but don't have the time to follow it closely (check also The Future of Credit Cards - Earning virtual currency for spending in the real world & other world bridging, by Phillip Torrone.)
Anshe Chung - the "virtual Trump" - even gets the cover of the mag. The land development business, which the avatar has built from nothing two years ago, has turned into an operation of 17 people. Second Life participants pay Linden dollars, the game's currency, to rent or buy virtual homesteads from Chung so they have a place to build and show off their creations. They can then convert the play money into dollars by using their credit card at online currency exchanges. To handle rampant growth, Chung opened a 10-person studio and office in Wuhan, China. Says Chung's owner: "This virtual role-playing economy is so strong that it now has to import skill and services from the real-world economy."..." [from Regine's post on we-make-money-not-art]
April 22, 2006
Portuguese Festival of Performance and Visual Art
Welcome Goodbye Adeus Obrigada: Journeys dislocations and imaginary nations
Welcome Goodbye Adeus Obrigada brings together artists that use journeys and imaginary places as the departure for their work. Their projects share references to tourism, migration and new geographies, encouraging speculations about the existence of a third space: a hybrid territory developing from transnational encounters and cross-border dialogues embedded in the fabric of day to day life.
Their work takes place in and around the Blue Elephant Theatre in Camberwell and Little Portugal, the name given to Stockwell's thriving Portuguese community in the heart of south London. The programme brings together Portuguese artists living in Lisbon and the UK for the first time.
Welcome Goodbye Adeus Obrigada: Journeys dislocations and imaginary nations--Portuguese Festival of Performance and Visual Art; 3 May to 30 June
Blue Elephant Theatre, 59a Bethwin Rd, Camberwell SE5 0XT.
March 08, 2006
Expanded Opportunities for Arts and Performance
"I spoke with Ann Doyle this morning. Ann is the manager for Arts and Humanities Initiatives at Internet2, a consortium of universities, industries and government that are developing and deploying advanced networking applications and technologies. (Beth Miklavcic and Jimmy Miklavcic, whom I interviewed yesterday about their InterPlay performance, use the Internet2 for their distributed programs).
You should definitely visit the link to the Arts and Humanities Initiatives that I just mentioned above. There are some fascinating resources about a range of distributed arts programs. I happened to come across an interview (PDF) with James Oliverio, director of the Digital Worlds Institute at the University of Florida, whom I'll be interviewing tomorrow about their dance and performance programs that unite multiple locations around the globe.
I first contacted Ann Doyle because I wanted to learn more about the "Cultivating Communities" dance program that she hosted for Internet2 in 2002. If you visit "Cultivating Communities," you can learn about a series of dance performances that brought together dancers from multiple locations using motion tracking, motion capture and other interactive technologies in conjunction with the near TV broadcast quality of the Internet2 infrastructure - you'll also find a number of videos for these performances.
Ann believes that there are two important ways that the Internet2 project contributes to fostering new types of artistic possibilities and performance opportunities.
First, the network infrastructure delivers near broadcast quality video and audio to participating sites. Plus, there is very low latency, which means that the "roundtrip interactive time," as Ann says, is virtually nil. With top-notch video and low latency, there are opportunities for distributed spontaneity that really didn't exist before when collaborating with remote colleagues and artists.
Second, this network infrastructure provides an opportunity for dancers and other artists to "think digitally." For example, if you go back to the "Cultivating Communities," a new choreographic question arises. Choreographers now have to think beyond the fixed borders of a single stage and consider their work in the broader context of multiple locations joined together by a digital network. These types of distributed programs raise many questions about the choreographic process and dance in general.
They also raise the question of how dancers and other artists are being prepared for distributed programs at the college and university level. Ann mentioned the Manhattan School of Music (MSM), which has been a leader in leveraging the Internet2 for conducting master classes and performances. Here's an article by Christianne Orto about the educational programs at MSM. Orto is the director of recording and distance learning at MSM and you can read an overview of their Distance Learning programs. Ann also directed me to dance programs at universities that are using Internet2 to conduct performances and classes - I'll be following up soon with these contacts." [blogged by Doug Fox on Great Dance Weblog] [Related post: Interview with Another Language About InterPlay (image above)]
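The "roundtrip interactive time" Ann describes is just the round-trip latency of the network path. As a hedged, minimal sketch (not an Internet2 measurement — a real path would be measured end to end between the participating sites), here is how round-trip time can be timed over a loopback UDP echo in Python:

```python
# Sketch: measure round-trip time with a one-shot UDP echo on localhost.
import socket
import threading
import time

def echo_once(sock):
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)          # bounce the packet straight back

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))        # bind to any free loopback port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(5)
start = time.perf_counter()
client.sendto(b"ping", ("127.0.0.1", port))
reply, _ = client.recvfrom(1024)     # blocks until the echo returns
rtt_ms = (time.perf_counter() - start) * 1000
print(f"round trip: {rtt_ms:.3f} ms")
```

On loopback the figure is fractions of a millisecond; the point of a network like Internet2 is to keep the equivalent wide-area figure low enough that remote performers can respond to each other in what feels like real time.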
March 03, 2006
A project by Marc Garrett (director, creative), Neil Jenkins (director, creative & tech), Ruth Catlow (director, creative) and Furtherfield in collaboration with Arts Council England (London) (funder).
VisitorsStudio is an open, multi-user, online arena for creative 'many-to-many' dialogue and networked performance. Through simple and accessible facilities, the web-based interface allows users to collage and manipulate their own and others’ audio-visual files, and to imaginatively recontextualise existing media. VisitorsStudio provides a platform for explorations of collective creativity for both established artists and those excluded from traditional art structures, for reasons of geography or social circumstance.
In March 2006, Furtherfield will launch VisitorsStudio Version2, which incorporates new artistic tools and community building facilities, allowing users to schedule and promote their own performance programmes. These can be recorded, archived, downloaded and redistributed as screensavers. 25th-26th March "VisitorsStudio" @ Idea Store, Chrisp Street [a NODE.London event]
Participants upload content, image and sound files (JPG, MP3, SWF) to a shared database, responding to each other’s compositions in real time. Individuals can also chat with each other and are identified within the online space by their own dancing-cursors.
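The upload step described above — accepting only the supported media types into a shared store — can be sketched as follows. This is an illustrative guess at the shape of such a system; the names and structure are hypothetical, not VisitorsStudio's actual API.

```python
# Sketch of type-checked uploads into a shared collection (names hypothetical).
SUPPORTED = {".jpg": "image", ".mp3": "sound", ".swf": "animation"}

shared_database = []  # stand-in for the server-side store

def upload(user, filename):
    ext = filename[filename.rfind("."):].lower()
    if ext not in SUPPORTED:
        raise ValueError(f"unsupported file type: {filename}")
    entry = {"user": user, "file": filename, "kind": SUPPORTED[ext]}
    shared_database.append(entry)
    return entry

upload("ruth", "harbour.jpg")
upload("neil", "loop.mp3")
print(shared_database)
```

Because every participant reads from and writes to the same collection, each new upload is immediately available as material for the others' compositions — the "many-to-many" dialogue the platform is built around.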
VisitorsStudio is a Furtherfield project collaboratively developed by artists, programmers, critics and curators, with significant contributions by audiences new to net art, and members of online art and technology forums. This platform continues to grow organically in response to their participation.
February 27, 2006
The performance is called "Indigenous Maniacs" and was inspired by material presented at the congress, including Liz Bryce's MFA project "Becoming Indigenous: an impossible necessity". Liz's work explores concepts inherited by Pakeha (white) New Zealanders through the desires of their colonial ancestors, and speculates on the paradox - to "become" indigenous. "Indigenous Maniacs" was created during four three-hour workshop sessions, and when it was presented at the conference we were all in the same room together - so it will be a new challenge to perform it with everyone in different locations, some on dial-up, and no physically present audience.
We'll meet in the Introduction Stage at the following times, then proceed to the Indigenous Maniacs stage: USA California: 12am; USA New York: 3am; UK: 8am; Western Europe: 9am; Finland: 10am; Australia NSW: 7pm; NZ: 9pm. For other local times, check http://www.worldtimeserver.com/.
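Time lists like the one above can be derived mechanically rather than by hand. A hedged sketch using Python's standard zoneinfo module, assuming the 9pm New Zealand slot falls on the post's date (27 February 2006 — an assumption; daylight-saving boundaries would shift the results for other dates):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 9pm in New Zealand on the assumed date.
event = datetime(2006, 2, 27, 21, 0, tzinfo=ZoneInfo("Pacific/Auckland"))

for label, zone in [
    ("USA California", "America/Los_Angeles"),
    ("USA New York", "America/New_York"),
    ("UK", "Europe/London"),
    ("Western Europe", "Europe/Paris"),   # representative zone
    ("Finland", "Europe/Helsinki"),
    ("Australia NSW", "Australia/Sydney"),
]:
    local = event.astimezone(ZoneInfo(zone))
    print(f"{label}: {local:%I:%M %p, %d %B}")
```

Run for this date, the conversions reproduce the listed times (California 12am, New York 3am, UK 8am, and so on), which is why the post can simply point everyone else at worldtimeserver.com.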
Point your browser at http://upstage.org.nz:8081/stages/presentation; there's no need to log in for the performance (in fact we'd prefer it if you didn't), but if you'd like to log in and play afterwards, please email me for a log-in.
UpStage is a web-based venue for live performance; in this creative online environment, multiple players in remote locations work together to compile avatars, images, text, speech and web cams for real-time digital story-telling.
February 08, 2006
First Monday, Special Issue #4:
Urban Screens: Discovering the Potential of Outdoor Screens for Urban Society, Pieter Boeder, Geert Lovink, Sabine Niederer, and Mirjam Struppek, eds.
Table of Contents:
Introduction: Discovering the potential of outdoor screens for urban society by Pieter Boeder and Mirjam Struppek
Urban screens: The beginning of a universal visual culture by Paul Martin Lester
The politics of public space in the media city by Scott McQuire
The poetics of urban media surfaces by Lev Manovich
Interpreting urban screens by Anthony Auerbach
Story space: A theoretical grounding for the new urban annotation by Rekha Murthy
The urban incubator: (De)constructive (re)presentation of heterotopian spatiality and virtual image(ries) by Wael Salah Fahmi
Urban screens: Towards the convergence of architecture and audiovisual media by Tore Slaatta
Towards an integrated architectural media space by Ava Fatah gen. Schieck
Art and social displays in the branding of the city: Token screens or opportunities for difference? by Julia Nevárez
Hijacking the urban screen: Trends in outdoor advertising and predictions for the use of video art and urban screens by Raina Kumra
For an aesthetics of transmission by Giselle Beiguelman
Intelligent skin: Real virtual by Vera Bühlmann
Programming video art for urban screens in public space by Kate Taylor
Augmenting the City with Urban Screens by Florian Resatsch, Daniel Michelis, Corina Weber, and Thomas Schildhauer
Welcome, gentle reader, to this First Monday Urban Screens special issue, the first publication of its kind. With the advent of digital media, the global communication environment has changed dramatically. In the context of the rapidly evolving commercial information sphere of our cities, especially since the 1990s, a number of novel digital display technologies have been introduced into the urban landscape. This transformation has intersected with other major transformations of media technology and culture over the last two decades: the formation of distributed global networks and the emergence of mobile media platforms such as mobile phones. Their cumulative and synergistic impact has been profound. Convergence of screen technologies with digital communication technologies such as GSM, RFID, Internet and database technologies has led to the emergence of a new, interactive and increasingly pervasive medium: Urban Screens.
Urban Screens can be defined as interactive, dynamic digital information displays in urban environments. Their genesis is the consequence of two parallel technological developments: evolution and subsequent growth in magnitude of the traditional display screen, and its subsequent convergence with other digital media technologies. Forms and appearances range from large daylight compatible LED billboards, plasma or SED screens, information displays in public transportation systems and electronic city information terminals to dynamic, intelligent surfaces that may be fully integrated into architectural façade structures. Their introduction in the urban environment poses new, unparalleled challenges and opportunities, which we will explore and document in this issue.
Currently, the primary purpose of this new infrastructure appears to be the management and control of consumer behaviour through advertising. Commercial companies are starting to realise that digital billboards are a powerful medium to communicate their goals and missions, in line with the new paradigms of the digital economy. Interconnected Urban Screens have tremendous potential to serve as a platform for information exchange. Such large networks are already being developed in Russia, China, the USA and South America, where Urban Screens are rapidly becoming a key element in commercial and government informational infrastructure. The implications for the public sphere are profound. Information density per square metre is increasing, yet at the same time individuals have less control than ever over the actual format and content of that information.
Public space has always been a place for human interaction, a unique arena for the exchange of rituals and communication. Its architecture, being a storytelling medium itself, plays an important role in providing a stage for this interaction. The ways in which public space is inhabited can be read as a participatory process of its audience. Its (vanishing) role as a space for social and symbolic discourse has often been discussed in urban sociology. Modernisation, the growing independence of place and time and individualisation seem to devastate traditional city life and its social rhythm. The Urban Screens project explores the opportunities for opening this steadily growing infrastructure of digital screens, currently dominated by market forces, to cultural content, along with its potential for revitalising the public sphere.
Urban Screens 2005 was the first international conference that was solely dedicated to the emerging Urban Screens phenomenon. Presentations covered a broad spectrum of topics and issues, ranging from critical theory to project experiences by researchers and practitioners in the field of art, architecture, urban studies and digital culture. It addressed the growing infrastructure of large digital moving displays, which increasingly influence and structure the visual sphere of our public spaces. Urban Screens 2005 investigated how the currently dominating commercial use of these screens can be broadened and culturally curated: can these screens become a tool to contribute to a lively urban society, involving its audience interactively?
A new medium that is digital, interactive and pervasive
What we are seeing is the emergence of a new medium that is digital, global and local, interactive and pervasive at the same time. What happens if the convergence of new technologies such as Internet, database and mobile technologies suddenly enable interactive access to the visual streaming of these digital surfaces? Can it revitalise the public sphere by creating an information-dense urban environment or is it a major threat? How does the growing infrastructure of digital displays influence the perception of the visual sphere of our public spaces? Metaphorically speaking, can or do Urban Screens already function as a mirror, reflecting the public sphere?
The Urban Screens project aims to address these questions in a transdisciplinary debate, present new approaches to answering the most urgent questions, exchange experiences, and create and maintain a thematic network around the subject to initiate future collaborations. The Urban Screens 2005 conference in Amsterdam addressed the existing commercial predetermination and explored the nuance between art, interventions and entertainment to stimulate a lively culture. Other key issues were mediated interaction, content, participation of the local community, possible restrictions due to technical limits, and the incorporation of screens in the architecture of our urban landscape.
Urban Screens 2006: Demonstrating the potential of public screens for interaction
Building upon the results of Urban Screens 2005, the Urban Screens 2006 conference (Berlin, October 5-6) will elaborate on the discussion and develop the broad spectrum of possible formats and usage of this emerging new media infrastructure. Urban Screens 2006 will be a platform for demonstrating the potential of public screens for interaction in a trinity of infrastructure, content and cooperation models. Interconnected topics will be the politics of public space, multimedia content as a service for an array of portable devices, urban neighbourhood reactivation, interaction design of urban screens, standardisation and integration in the urban landscape. Using existing screens infrastructure as well as future 'Urban Screens furniture' in the urban space of Berlin, we will demonstrate the impact of Urban Screens, their contextualisation and situatedness. This unique accumulation of projects will serve as a playground and research field for practical observations on the interplay of screen technology, content, location and format.
Urban Screens 2007: Expanding the potential of content for community
Urban Screens 2007 is currently under preparation in collaboration with BBC Public Space Broadcasting. While Urban Screens 2006 will have 'brick & mortar' accents, Urban Screens 2007 will have a distinct focus on the potential of journalistic content: issues surrounding the production and display of media content for Urban Screens, as well as adaptive reuse of 'old' content for new media will be explored in detail. Key issues and topics will include Public Space Broadcasting (PSB), the politics of public space, mediated interaction and participation, as well as experiments with new participatory formats. PSB can energise the hearts of cities by bringing together communities to share events and broadcasts, creating public news and information points that double as local meeting places. Largely due to the innovative work of the BBC, PSB is starting to prove its potential to provide an outlet for community and educational activities, public service information, visual arts, digital innovation and local content production, revitalising the public sphere.
We hope that you will share our excitement.
January 30, 2006
Avatars Among Us
Breathing a Second Life into the Keynote
"DEMOCRACY ISLAND -- There are certainly more glamorous ways to spend a Friday night than watching an animated version of a computer engineer discuss approaches to harnessing collective intelligence. But, for more than 25 people who showed up here last week to hear computing pioneer Douglas Engelbart address a Silicon Valley futurist group, the promise of intellectual stimulation prevailed.
In exchange for insights on the concept of the dynamic knowledge repository, audience members watched a speech delivered by a gray-haired, suit-and-tie clad avatar bearing a more than passable resemblance to the man best known for inventing the computer mouse. The talk, held at a virtual locale known as Democracy Island in the multiplayer online world Second Life, drew a less realistic audience. Some in the crowd weren't even human, sporting features like antennae, fur and wings..." From Avatars Among Us by Joanna Glasner, Wired.
Related: Lessig in Second Life
as well as Giving a PowerPoint Presentation in Second Life.
January 11, 2006
Contemporary Society and Genomic Culture
In order to find its autonomy, Bio Art (a term described by Jens Hauser, curator of the "L'Art Biotech" exhibition, as an "etymological disgrace") is passing through the necessary evolutionary phases towards a complete definition, conquering a conceptual autonomy independent of the means used. The image that comes to mind is that of a 'chimera', a hybrid creature mixing different species that nevertheless expresses a peculiar compositional coherence, both paradoxical and efficient. The facets of dirtying one's own hands with the basic elements of organic material (genes, cells, proteins, etc.) are many: analyzing the female body as a contemporary technological laboratory (in the performances by the subROSA collective), conceptually challenging the current eugenic development models, or contesting the usual schemes of knowledge hoarding, as Eugene Thacker does with his Open Source DNA.
The ferment created by the Critical Art Ensemble and Eduardo Kac can now be recognized by filtering the media flow of relevant announcements by the industry, as done here by Ricardo Dominguez, or reconstructed through Birgit Richard's reflections on clones and their mediatic reproductions. This way, it's possible to feel the pulse of the silent conflict surrounding these subjects. In times when human intervention on organic creatures is as far-reaching as what nature does in decades, and building life is a fact, the definition of shared critical positions becomes the social fulcrum around which the cultural development of these technologies revolves. (edited and curated by) Dmitry Bulatov, The National Publishing House "Yantarny Skaz", ISBN 5740608537 [via NEURAL]
January 10, 2006
The first UpStage walk-thru of 2006 will be held this coming Wednesday, 11 January, at the following times: California, USA: 12am; New York, USA: 3am; UK: 8am; Western Europe: 9am; Finland: 10am; Australia - NSW: 7pm; NZ: 9pm. Check here or here for your local time.
Audience members, point your browser at http://upstage.org.nz:8081/stages/presentation. If you want to participate as a player, email helen varley jamieson for a log in--helen[at]creative-catalyst.com
January 04, 2006
PING GENIUS LOCI
Pixels Sense the Presence of People
PING GENIUS LOCI is a programmable space installation, a new member in the induction house series, developed by aether architecture with Bengt Sjölén. Based largely on the research thread developed during the previous installations, version 4 tries to achieve the programmable matter aspect of the induction house series on a much larger scale, with more direct interaction scenarios.
We have developed self-sustaining analogue pixels that work outdoors in the sunshine, using solar power and radio communication, and can sense the presence of people. We are now producing 400 of these sensor pixels to create a large interactive field that functions both as a sensor field and a large outdoor screen. At the same time, interaction scenarios, games and functions are being developed to open up the architecture to use. [blogged by Regine on we-make-money-not-art]
January 02, 2006
Pure Data networked jam sessions
PD (an acronym of Pure Data) has recently emerged among the many software tools devoted to real-time sampling and audio/video streaming, mostly thanks to its flexibility during live performances. It's a real-time coding environment suitable for video, audio and graphic editing. Roman Haefeli has developed an environment, based on PD, made to facilitate electronic musicians' jam sessions over a network. It's a client-server system, so it works on any network (internet included), and its name, NetPD, derives from this feature. It is not intended as a platform for creating sounds, but as an environment where every client (i.e. every computer connected to a NetPD server) can share its music patches. The most interesting part is that the same patches can be played through NetPD, and this feature triggers the jam sessions, welcoming all the different contributions. A further peculiarity is that you can't share sound files (even if they are embedded in a patch). On one hand this makes sample sharing impossible, but on the other hand it underlines the truly innovative significance of generative music. [Vito Campanelli, neural]
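The two distinctive rules of NetPD as described — the server relays patches to every other connected client, but refuses to carry raw sound files — can be captured in a small sketch. This is a hedged toy model with hypothetical class and message names, not NetPD's actual protocol:

```python
# Toy model of a patch-sharing relay: patches fan out to all other clients,
# sound files are rejected outright (the restriction the review describes).

class NetPDServer:
    def __init__(self):
        self.clients = []

    def connect(self, client):
        self.clients.append(client)
        client.server = self

    def relay(self, sender, message):
        if message.get("kind") == "soundfile":
            raise ValueError("sound files cannot be shared")
        for client in self.clients:
            if client is not sender:      # everyone but the sender
                client.inbox.append(message)

class Client:
    def __init__(self, name):
        self.name, self.inbox, self.server = name, [], None

    def share_patch(self, patch_name):
        self.server.relay(self, {"kind": "patch", "from": self.name,
                                 "patch": patch_name})

server = NetPDServer()
alice, bob = Client("alice"), Client("bob")
server.connect(alice)
server.connect(bob)
alice.share_patch("drone-sequencer")
print(bob.inbox)
```

Because only patch descriptions travel over the wire, each machine must synthesise its own sound from the shared patch — which is exactly why the restriction pushes the jam sessions towards generative music.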
November 23, 2005
4th Telematic Forum
Call For Papers: IS - Internet e Storia (Internet and History)
Call For Papers: IS - Internet e Storia (Internet and History) 4th Forum 15 January - 15 March 2006.
The topics for the 4th telematic forum are the Internet, History and all subjects related to the application of multimedia technology to History and, in general, to Human Studies. Each speaker presents an abstract through the specifically prepared mailing list. Essays will be placed on the official site, which can be consulted by registered members. Proceedings will be published in «Storiadelmondo» ISSN 1721-0216, a public-access electronic journal dealing with world-wide History and Human Studies. Papers will also be published in a CD-ROM edition.
Submission Information: Speakers participate by invitation or by self-candidature through the official site pages. Website (English, Italiano, Español). Submissions in electronic form (DOC, RTF, TXT) are strongly preferred. Scholars at all stages of their careers are equally welcome (a short bio-bibliography is required). Candidatures must arrive by January 9, 2006 for papers and/or reviews of books, websites and software.
For application instructions and further information about the IS - Internet
e Storia, contact:
Medioevo Italiano Project
V.le O. Sinigaglia, 48
I-00143 Rome, Italy
October 27, 2005
UpStage! This Week
JeanRichard + Open Session
It's a busy week this week in UpStage: on Saturday 29 October we have a live public performance by JeanRichard; and on Wednesday 2 November we have the regular Open Session and walk-through.
LIFE2 performance by JeanRichard: This is the first public performance to be staged in UpStage by a group other than Avatar Body Collision (the creators of UpStage). JeanRichard is a Swiss family of artists who began using UpStage a few months ago, and they love it so much they've already reached the point where they are ready to give their first public performance. [Related]
To join the performance, you don't need to be logged in; just go to http://jeanrichard.ch/Life2/ where there will be a live link to the stage from one hour before the performance time. And the performance time is: 1am California; 4am New York; 9am UK; 10am Switzerland; 11am Finland; 6pm Australia (NSW/Queensland); 9pm New Zealand. See http://www.worldtimeserver.com/ for other local times.
UpStage Open Walk-Through: This session takes place on the first Wednesday of the month and is an opportunity for newcomers to learn about UpStage and for regulars to have an open jam session. To register for the session and request a log-in, email me (helen[at]creative-catalyst.com) - even if you're a regular, please let us know that you're coming.
As usual, we'll begin on the Introduction stage, http://upstage.org.nz:8081/stages/presentation. Depending on numbers, some may go to another stage for improv while newbies learn the basics on the Introduction stage. Those of you who are logging in will do so from http://upstage.org.nz:8081
The times this month are: 1am California; 4am New York; 9am UK; 10am Netherlands; 11am Finland; 7pm Queensland Australia; 8pm NSW Australia; 10pm NZ
See http://www.worldtimeserver.com/ for other local times; please note that European clocks have changed between these two events - if you are in any doubt over the correct time, find your local corresponding time for the Swiss time of 10am for the JeanRichard performance on 29 October, and for the New Zealand time of 10pm for the open session on 2 November.
October 17, 2005
JeanRichard-family, life, art
LIFE2[at]SPACE is a live performance by Family JeanRichard, taking place in UpStage on Saturday 29 October at 10am MEZ. It will feature 3 works of JeanRichard; 4 avatars; 5 props. Check your local time at http://www.worldtimeserver.com. One hour before the performance starts, there will be a live link to the stage from http://jeanrichard.ch/life2.
JeanRichard work as a family and show their processes and products on their weblog. The weblog represents the art project itself, but also serves as a gallery where posts are sold online, and they aim to build up a kind of museum (other artists/curators etc. show works that are related to JeanRichard). See Baby Blogging.
September 29, 2005
Next UpStage Walk-thru, October 5
Kia ora koutou: this is a reminder about the monthly UpStage walk-thrus [the next one is next week!]--Wed 5 October: California, USA: 2pm; New York, USA: 5pm; UK: 10pm; Western Europe: 11pm; Finland: 12 midnight; Australia (NSW/QLD): 7am [Thurs]; NZ: 10am [Thurs]; or check http://www.timeanddate.com [for your local time].
The format will be newbies in the Introduction stage [an overview of the tools and tricks in the UpStage performative space], while other crew devise with members of Avatar Body Collision [please come to the Introduction stage first for directions :)]. To take your place as a player, please email colliders[at]avatarbodycollision.org for a login and password. Audience members, please proceed to http://upstage.org.nz:8081/stages/presentation.
September 12, 2005
Digital Turin for the 2006 Olympics
How do urban dynamics change in the digital era? What tools does technology give us to aid the study and design of the territory? A concrete answer to these questions comes from the project by a Turin group of architects and new media experts. Maurizio Cilli, Carlo Infante, Riccardo Mantelli, Filippo Moncelli, Max Paccagnella and Stefano Ruggeri have created glocalmap.to, a digital map of the Turin Olympic area, developed for the Cultural Olympics in collaboration with TOROC, the Games' organising committee.
Global and local, glocalmap.to is an interactive web platform open to inhabitants and tourists alike; for the 42 days of the Winter Olympics in February and March 2006, they will be asked to say what they think of their city. It works quite simply: those wishing to do so can send a message (sms, mms and e-mails with attachments), which will be stored and displayed at a precise point on the map, visible to all via a number of screens. Apart from the play factor, glocalmap.to is intended to represent an experience of participation in urban planning.
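The mechanism described above — each incoming message stored and displayed at a precise point on the map — amounts to a store keyed by location. A hedged sketch of that structure (hypothetical names, not glocalmap.to's actual implementation):

```python
# Sketch: messages pinned to map coordinates, listable for display on screens.
from collections import defaultdict

map_messages = defaultdict(list)   # (x, y) map point -> list of messages

def post_message(x, y, sender, text):
    """Store an incoming sms/mms/email at a precise point on the map."""
    map_messages[(x, y)].append({"from": sender, "text": text})

def messages_at(x, y):
    """Everything pinned to that point, visible to all."""
    return map_messages[(x, y)]

post_message(3, 7, "visitor", "great view from the Mole Antonelliana")
post_message(3, 7, "resident", "try the cafe on the corner")
print(len(messages_at(3, 7)))
```

Keying the store by map point is what turns a stream of individual messages into a collective annotation of the city — the participatory-planning experience the project is after.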
For the future they are considering a vectorial version that will allow architects to place their projects directly on the territory. The aim? To trace a new spontaneous map of Turin that will reveal fresh and unexpected geographies. The presentation of the project is scheduled for 6pm today at the Atrium foundation. The speakers are Giovanni Ferrero, Maurizio Cilli, Stefano Boeri and Paolo Verri coordinated by Carlo Infante. Footnote: The Beach club, Turin's urban beach on the River Po, is celebrating its fifth birthday and hosting an open meeting of the Domus editorial team from 8pm onwards.
September 06, 2005
One Story, Multiple Viewpoints
In The Wallpaper, various elements of a visual and aural scene (humans, chairs, desks, ambient sounds) are recorded and stored separately from each other in a computer and then recombined in real-time during playback according to instructions in a script. Representing the scene in this way allows the creator of this piece to experiment with changing camera angles and shot selection, swapping characters, and modifying acoustical characteristics based on audience interaction in order to express different subjective points of view of the same story. The piece is also responsive to passive circumstances of its delivery, by showing more or fewer cuts and close-ups depending on the size and shape of the viewing window.
These two MPEG movies are two of many possible playouts of the production. One presents the story from John's subjective point of view, and the other presents Kathy's point of view. You will notice differences in shot composition, close-up placement, backgrounds, ambient sounds, and acoustics. The system is also capable of blending these two "extremes" to produce versions that mix the two story perspectives to different extents. In the actual piece, the viewer has real-time control over the subjective story perspective and the camera position. The Isis object-based media prototyping environment was used to script the entire presentation.
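The object-based idea described above — elements stored separately and recombined at playback time according to a script — can be sketched in miniature. This is a hedged illustration with invented names and placeholder "shots", not the Isis environment actually used for the piece:

```python
# Sketch: separately stored scene elements recombined per viewpoint at playback.
elements = {
    "john_closeup": "shot of John's face",
    "kathy_closeup": "shot of Kathy's face",
    "wide": "wide shot of the room",
    "hum": "ambient office hum",
}

# The "script": which elements make up each subjective point of view.
script = {
    "john": ["john_closeup", "wide", "hum"],
    "kathy": ["kathy_closeup", "wide", "hum"],
}

def playout(viewpoint, window_width=640):
    """Assemble one playout; small windows get fewer close-ups."""
    shots = list(script[viewpoint])
    if window_width < 320:
        shots = [s for s in shots if "closeup" not in s]
    return [elements[s] for s in shots]

print(playout("john"))
print(playout("kathy", window_width=200))
```

Because the recombination happens at playback rather than at recording, the same stored material yields many playouts — per viewpoint, per window size, or per audience interaction — which is the core of the object-based approach.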
September 01, 2005
Open Session & Improv: September 7
The next open session in UpStage will take place on Wednesday, September 7 at the times below. This time we will divide into two groups, with a beginners' session happening on the Introduction stage while those who've already learned the basics go to another stage to improvise a short performance for an hour. In the second hour, we will invite everyone to watch and interact with the improvised performance.
Wednesday 7 September: California: 2am; New York: 5am; UK: 10am; Western Europe: 11am; Finland: 12 noon; Australia (NSW/QLD): 7pm; Aotearoa NZ: 9pm. Check http://www.worldtimeserver.com/ for your local times. If you want to come and learn, email me for a log-in, then come to http://upstage.org.nz:8081/ to log in and follow the link in the top right-hand corner to the Introduction stage.
If you want to come & improvise a performance, you still need to email me (in case we run into double-ups with guest log-ins) & we'll also meet at the introduction stage, then move to another stage to work. I'm proposing the theme of "water" as a starting point (given recent floods in Europe & the global problem of the growing lack of fresh water in the world) so you can think about backdrops & avatars that relate to that theme. If you upload any graphics, assign them to the Virtual Tourist stage & we'll use this one for the improv.
If you'd just like to come to watch the performance, you don't need to log in. Come to the Introduction stage towards the end of the first hour of the session, & when the performers are ready you will be led through to another stage.
Any questions, email me, & i hope to see you next week.
helen varley jamieson: helen[at]creative-catalyst.com
August 19, 2005
Cyberformance Bug Party
we'll use the introduction stage as our starting and meeting place, and the aim of the party - apart from having a good time - is to identify and prioritise bugs, and hopefully allocate some to people who can work on them.
if you want to join us, email helen varley jamieson: helen[at]creative-catalyst.com.
This year, the Eclectic Tech Carnival took place from 11-15 July in Graz, Austria. During the festival, UpStage workshops were taught remotely by Helen Varley Jamieson in New Zealand. Participants created their own avatars and backdrops, and improvised a short cyberformance.
A theme of the festival was open source chat and streaming video. In the screen grab at top left, Helen's web cam appears in UpStage with an overlay of the live web cam feed from Graz.
August 18, 2005
Capturing and Annotating Media both Visually and Verbally
Video Traces was conceived and designed by Dr. Reed Stevens of the College of Education and has been developed and studied in collaboration with PETTT. Video Traces is a system that makes it easy to capture a piece of rich digital media, such as video or a digital image, and to annotate that media both visually (using a pointer to record gestures) and verbally. The resulting product is a "video trace": a piece of media plus its annotation--in essence, a recorded "show & tell". Traces can be viewed by their creator, exchanged with others, and further annotated for a variety of teaching and learning purposes. Video Traces provides a unique opportunity to capture embodied knowledge and educational interactions by supporting the most common ways people communicate their ideas--through talking, showing, and pointing.
The Video Traces project serves PETTT goals in a number of ways:
· Explore the interplay of technology and pedagogy in real settings: We have collaborated with individuals both within and outside of the University, and have explored educational uses of Video Traces in settings such as dance studios, architectural sites, and science museums.
· Facilitate thoughtful and innovative educational technology uses: We have found that using Video Traces prompts reflection on the part of both learners and instructors, and in several cases has inspired instructors to adopt innovative new strategies for teaching their courses and for assessing student learning.
· Make strong connections between research, design, and practice: We have used our observations about how people use Video Traces and our interviews with learners and instructors to inform the redesign of the software and generate further questions for research.
Video Traces: Rich Media Annotations for Teaching and Learning by Reed Stevens, Gina Cherry, and Janice Fournier
August 07, 2005
3D Theater on a Table
m@terials, developed by Jitsuro Mase, uses an off-the-shelf LCD projector to display a "3D theater" on a table. A video clip is available on the Digital Stadium website. It's amazing that this actually works, because the hardware looks deceptively simple: transparent plastic strips stand diagonally at 45 degrees on the table top, making the projected images appear to stand up. The virtual people can be projected onto a strip closer to you or onto ones further away.
Digital content displayed on this device must conform to specific rules; however, "it is not so difficult," according to the creator. The device could be used for many different kinds of things besides the "3D theater", especially if it could be made larger (possibly as big as a computer display, or a building floor?) with high-fidelity, realistic images. [blogged by manekineko on we-make-money-not]
August 04, 2005
Virtual Playa Project
The VIRTUAL PLAYA PROJECT is a navigable 3D digital Burningman environment using Microsoft Flight Simulator as a platform. It is intended to be an open-ended project that invites participation at various levels: it can be downloaded for home use, played on a giant screen at a Burningman event, or even used as a design tool for a theme camp or an artist wishing to plan an installation before it ever gets to Black Rock City.
The ultimate wish for the project, however, is for the Virtual Playa to become the Burningman Cyber Regional. Using multiplayer technology, it can become a portal through which we can meet online and share experiences with other cyber burners anywhere in the world in real time. This takes the project from being just a cool piece of collaborative digital art to a true meeting place for the cyber-tribe. Download it for free, copy it, send it to pals, leave it on buses, give it away as a gift.....spread the word.
July 04, 2005
Next Walk-Through Session: July 6
Yes folks, it's that time of the month again: the next UpStage Walk-Through will be held on Wednesday 6 July at the following times: California: 1am; Ontario: 4am; New York: 5am; UK: 9am; Western Europe: 10am; Finland: 11am; Sydney: 6pm; NZ: 8pm. Check http://www.worldtimeserver.com for your local time. To participate, please email helen[at]upstage.org.nz for a username to log in.
NOTE: If you have previously attended a walk-through using one of the guest log-ins, you still need to email me for a new log-in before turning up: we reissue the guest log-ins, so you could end up trying to log in with the same one as someone else. Log in at http://upstage.org.nz:8081/; you will arrive in the workshop. At the right is a link to the Introduction stage, and this is where we'll meet for the walk-through. If you want to join us as an audience member, come directly to: http://upstage.org.nz:8081/stages/presentation.
June 28, 2005
Jam this Wednesday
PitchWeb jam this Wednesday, June 29, from 6-8 pm EDT. The occasion is a book release party Routledge is giving for William Duckworth's "Virtual Music: How the Web Got Wired for Sound." They'll be playing as DJ Tamara and the Laptops. The plan is to make the party a "virtual" experience: the party goers have been invited to bring their laptops, and Nora will be weaving the online band into Tamara's house mix and a webcast.
Hope you can join them online at http://www.pitchweb.net/. All you have to do is click, sign on, join a group, and play along. As with the 12-hour May-Day PitchWeb jam, Wednesday's session will also become source material for the yearlong Deep Time: Songs for Servers project that we're beginning later this year.
About the Book
· Must-reading for all interested in the world of web-based music
· Highlights diverse artists from John Cage to Moby to Scanner
· Includes unique CD sampler highlighting the composers and works discussed in the book
Virtual Music: How the Web Got Wired for Sound is a personal story of how one composer has created new music on the web, a history of interactive music, and a guide for aspiring musicians who want to harness the new creative opportunities offered by web composing.
For Bill Duckworth, the journey began in 1996, when he conceived the idea for an interactive webcast named "Cathedral," which was developed over a period of five years. On its completion, "Cathedral" won numerous awards, including the ASCAP/Deems Taylor Award for composition, and has already inspired further experimentation.
But this is more than the story of one composer or one piece of music. The book traces the development of interactive music through the 20th century, from Erik Satie through John Cage, Brian Eno, Moby, and Scanner. The technology itself is described as it has inspired experimentation by artists, including composers who have developed new ways to involve the audience in their music, plus possibilities for the non-musically trained to "play the Web." Challenges facing the web composer, from copyright issues to commercialization, are analyzed, with new solutions suggested.
Virtual Music is a fascinating story that will appeal to fans of new music, creators, performers, and anyone interested in how technology is transforming the arts.
May 27, 2005
UpStage Open Walk-Through
Next Session: June 1/2
The next open session in UpStage will happen on Wednesday, June 1 (or Thursday, June 2 depending on time zones) at the following times: 2pm June 1 California; 5pm New York; 10pm UK; 11pm Western Europe; 12 midnight Finland; 7am June 2 Sydney; 9am June 2 NZ
Check http://www.worldtimeserver.com for your local time. If you want to join us as an audience member, come directly to: http://upstage.org.nz:8081/stages/presentation.
If you want to log in and play, email me first for a username and password, then go to: http://upstage.org.nz:8081 and log in.
May 25, 2005
Alternative Realities in Networked Environments
A Comprehensive Medium for Exploring Emotions
The Alternative Realities in Networked Environments [ALTERNE] working group has completed the project, and the results are being transferred to a new web site. The ALTERNE project's objective was to construct an Alternative Reality platform to support the development of digital, interactive and participatory artistic activities. We define Alternative Reality as a generalization of Virtual and Mixed Realities beyond the common "space-based" simulation. The aim of the technical development is to extend current Mixed Reality techniques to support the more advanced experimentation with reality and virtuality that is required by the process of artistic creation. While traditional Virtual Reality essentially addresses the construction of visually realistic synthetic worlds, ALTERNE supports additional layers that make it possible to explore concepts such as causality, relations between time and space, alternative laws of physics and alternative life forms in a more radical fashion.
The ALTERNE platform brings a high level of integration among the current techniques supporting Mixed Reality: graphics, interaction and behavioural models. It promotes the artistic development and exploration of various forms of perception of spatio-temporal orders by placing the human body and its digital surrogates at the centre of the aesthetic considerations. One example is to provide a more comprehensive medium for the exploration of emotions, from the movements of the human body to the socialized interpretations of action.
Certain tools developed by ALTERNE will be made available over its web site and the MARCEL site as well. Some network tools are already posted for downloading.
Information Society Technologies: ALTERNE is a multi-annual research and development project supported by European Commission within Cross Programme Action 15 (Technology Platforms for Cultural & Arts Creative Expressions) of DG Information Society (reference nr. IST-2001-38575).
May 17, 2005
Distributed Immersive Performance
Real-time, Multi-Site Performance
The Integrated Media System Center (University of Southern California) is working on the architecture, technology and experimental applications of a real-time, multi-site, distributed, interactive and collaborative environment called Distributed Immersive Performance (DIP). The objective of DIP is to develop the technology for live, interactive musical performances in which the participants - subsets of musicians, the conductor and the audience - are in different physical locations and are interconnected by very high fidelity multichannel audio and video links. DIP is a specific realization of broader immersive technology - the creation of the complete aural and visual ambience that places a person or a group of people in a virtual space where they can experience events occurring at a remote site or communicate naturally regardless of their location.
May 09, 2005
3D Navigable Platform Exhibition and Workshops
LAb[au] is happy to invite you to Liquid Space 01+02 exhibition + lqs03 workshop taking place at Brakke Grond, Amsterdam_ the Netherlands. Liquid Space is a series of artistic workshops LAb[au] is setting up with different cultural institutions to design spatial audiovisuals with a specific focus on collaborative and shared processes resulting in installations, exhibitions and performances.
Here, the space navigable music platform--a 3D engine developed by LAb[au]--is proposed as the starting point for development and exchange for the invited artists. The engine is based on the principle of integrating different media in a structural, programmed manner, inside and through electronic space navigation: an environment where the performer navigates his created 3D space to compose music in real time, displayed in a 360° projection space with a quadraphonic sound system.
..lqs01: deSIGNforms _ Nabi Art Center, Seoul
..lqs02: deSIGNing by numbers _ MediaRuimte, Brussels
..lqs03: deSIGNing feedback loop systems _ Brakke Grond, Amsterdam
..04.05 - 14.05.05: liquid space 03 - workshop
..10.05 _ 20.00 h: liquid space 01+02 exhibition opening + presentation
..11.05 - 14.05 _ 10.00 - 24.00 h: exhibition *
..14.05 _ 20.00 h: liquid space 03 closing event, performance
LAb[au] + Eavesdropper, Els Viaene and Petersonic were invited by Brakke Grond to perform Exploring the Room in the context of the Liquid Space 03 deSIGNing feedback.loop systems workshop theme, and as an opening event of the Liquid Space 01+02 exhibition.
Exploring the Room is a performance in which music, best defined by the practice of soundscaping, and real-time generated computer graphics stand on the same level. Establishing a constant dialogue through its particular stage design, sound and visuals build the room: a 3.00 x 3.00 x 2.25 metre "cube" made of projection screens and a quadraphonic speaker setting, the minimal footprint able to host three musicians for one hour.
Performers and the audience project shadows onto the screen walls; the system captures this image and reintroduces it as an overlaid projection, closing the loop. All action is then a matter of balance between black and white, light and shadow, sound and silence, one and zero.
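The capture-and-reproject loop can be sketched numerically. This is a minimal NumPy simulation, not LAb[au]'s actual system; the frame size and blend factor are assumptions:

```python
import numpy as np

def feedback_step(camera_frame: np.ndarray, projected: np.ndarray,
                  blend: float = 0.5) -> np.ndarray:
    """One cycle of the loop: the captured image (shadows included) is
    overlaid on the current projection, closing the loop."""
    out = blend * camera_frame + (1.0 - blend) * projected
    return np.clip(out, 0.0, 1.0)  # keep brightness in [0, 1]

# Simulate a few cycles: a persistent 'shadow' region accumulates brightness
# in the projection while the rest of the screen stays dark.
projected = np.zeros((240, 320))
camera = np.zeros((240, 320))
camera[100:140, 100:140] = 1.0   # a performer's shadow on the screen wall
for _ in range(3):
    projected = feedback_step(camera, projected)
```

Each pass halves the distance to full brightness in the shadow region (0.5, 0.75, 0.875, ...), which is the "balance between light and shadow" the loop plays with.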
May 05, 2005
Full-Body Interfaces for Linking People
CINE--by Miro Kirov, Houston Riley, and James Tunick--is a new type of networked computing environment that exists in an immersive dynamic space instead of in a PC box. Its displays fill entire walls instead of being confined to small, isolating screens. It is our belief that the mouse and 2D desktop are dated and inadequate. CINE's full-body interfaces will allow richer modes of expression, and its creative 3D data organization will engage users, inspiring a sense of magic while making information retrieval and collaboration more efficient.
Similar to the futuristic vision of the *Holodeck, CINE is envisioned as an immersive visualization platform and advanced collaboration space equipped with an intuitive multi-user gesture interface. CINE, however, is not merely an illusionary virtual space, it is a mixed-reality space that augments group experiences by linking people in virtual spaces to people in real spaces. [via ColumnNetwork]
April 28, 2005
Next Two Sessions
The next open session in UpStage will happen on Wednesday 4 May, and we're doing two sessions again, so pick the time that works best for you:
First Session: 5am New York; 9am UK; 11am Western Europe; 12 noon Finland; 9pm NZ. Second Session: 4pm New York; 8pm UK; 10pm Western Europe; 11pm Finland; 8am Thursday 5th NZ.
Check http://www.worldtimeserver.com for your local time. If you want to join us as an audience member, come directly to: http://upstage.org.nz:8081/stages/presentation.
If you want to log in and play, email me first for a username and password, then go to: http://upstage.org.nz:8081 and log in.
April 26, 2005
Video Conferencing Software as a Performance Medium
Is There No There There?
"ABSTRACT: This paper surveys past performances in which the author collaborated with several other dancers, musicians, and media artists to present synchronized co-located performances at two or more sites. This work grew out of the author's participation in the landmark computer music ensemble, "the HUB". Each of the various performances was made possible by an evolving array of video conferencing hardware and software. These will be discussed. The problems and interesting side effects presented by latency and dropouts are a unique part of this performance practice. Leveraging the concepts of shared space, video and audio feedback generate evolving forms created by the combinations of the space, sounds and movements of the participants. The ubiquity of broadband Internet connections, and the integration and constant improvement of video conferencing software in modern operating systems, makes this unique mode of performance an essential area of research and development in new media performance." From Video Conferencing Software as a Performance Medium by Scot Gresham-Lancaster.
April 01, 2005
Walk-through: Wednesday 6 April
A platform for online performance, theatre and storytelling, UpStage is a web-based venue and tool for artists to compile different digital media for textual and audiovisual communication into a live performance, in real time, for online audiences. The walk-throughs, which usually last around 1-2 hours, give you hands-on experience of UpStage and how you can use it as a creative online tool. The next open walk-through in UpStage will be held on Wednesday 6 April, at 6pm New Zealand time (check here for your local time). There is a possibility of having a second session at a more civilized hour for people in Europe - if you're interested please email vhelen(at)creative-catalyst.com.
If you want to log in and play, email vhelen(at)creative-catalyst.com for a username and password, then go to http://upstage.org.nz:8081 and log in. To join the audience, go to http://upstage.org.nz:8081/stages/presentation.
March 29, 2005
A Platform for Networked Multimedia Performance
"Abstract: Kromozone is a networked, interactive/intra-active, computer-based realtime performance system. All elements are interconnected via ethernet interfaced within the Max/MSP/Nato environment. This allows for information at any station to be passed to any other station for control, monitoring, reinterpretation, and processing. An audio station’s parameters can effect the manipulation of video at another station for example. This creates not only interaction between performers, but intra-action where performers can actually get ‘inside’ each other’s instruments and directly affect the output. In some performances, stations are placed throughout the audience, who is invited to participate. In using the KromoZone performance system, we have explored the implications of networked performance, the use of alternative sources for signal and control input, the design and use of non-standard performance interfaces, and the inclusion of live-manipulated video projections as a viable and integral part of a performance." From KromoZone: A Platform for Networked Multimedia Performance by Stephan Moore and Timothy A. Place, University of Missouri, Kansas City.
March 14, 2005
MARCEL is a permanent broadband interactive network and web site dedicated to experimentation and research across the fields of art, science, education, technology and industry.
The network is made up of research centres, media labs, museums, arts organisations and arts practitioners who facilitate research, projects and collaborations which make use of high bandwidth networks - looking towards a future where fully interactive, virtual online experiences are as commonplace and user-friendly as today's relatively static internet.
Jerome Turner is the new researcher for the MARCEL Resources database, which will provide information on MARCEL related fields, events, projects and developments. If you would like to be included in the database, or suggest relevant material, go to MARCEL to find out more or email jerometurner(at)dsl.pipex.com
March 09, 2005
From the Renaissance to the Gigabits Networking Age
CITYCLUSTER is a virtual-reality networking matrix: a creative high-tech container with original technological features, navigation, interactivity, and graphic and content style, in which multiple environments, ambiences or cities, both real and imagined, can be hosted, coexist and be interrelated within a common virtual territory, interconnected by a high-speed network that enables remote participants to interact and collaborate in shared environments.
The framework may be expanded, modified, enriched, developed and produced ad hoc in accordance with the nature and typology of the environment to be incorporated. Visitors, with their own creativity and communicative skills, can become protagonists and/or free citizens: navigate, interact, intervene, exchange buildings, objects and ideas, and/or create their own ideal environment.
March 08, 2005
Glimpses of Unseen Places
A landscape from beyond the edges of the browser window that gives glimpses of unseen places, dotdotdot is constructed using several different motion capture systems and improvised performances to create abstract digital portraits. These animated avatars move and react to players' inputs within an online virtual environment.
"dotdotdot provides a good example of the ways in which practitioners using old media and new media can collaborate to produce an ongoing body of work challenging the canons from which each component of the work originates. Using motion capture, web tools, animation, games engines, sound and movement, dotdotdot presents a series of animated interactive vignettes. These can be manipulated in terms of speed, sound, rotation and movement so that the basis of the range of visuals on offer stays the same, but also so that the viewer/player can change them to accommodate their own preferences. [via Rhizome]
Each work has an Igloo signature but it allows the viewer to manipulate, play and be creative - after years of artists striving for truly interactive work, dotdotdot knowingly works with the limitations of interactivity in order to give the piece a characteristic style.
The viewer/player is presented with a range of choices involving genre orientated radio stations (you can choose drum & bass through to chart music) and a variety of visuals such as red, dot, bendy and plane. All these permutations and combinations together with the ability to interact with the animated figures in terms of speed and rotation challenge the viewer to think about the visual and audio styles in relation to their own experience of video, sound, animation and movement drawn from a variety of sources in day to day culture.
The project, which has been ongoing, is the product of Ruth Gibson, a dancer who is specifically looking at development of movement in the context of motion capture, and a programmer Bruno Martelli, who is interested in pushing the limits of technology in real physical contexts. The artists draw in expertise and collaborators as and when they need and the quality of the work is enhanced with every reworking. Their animations reflect these concerns with their 3D quality combined with recognisability of the human form in spite of the abstractness of the animated shape." Helen Sloan, Director - SCAN
March 07, 2005
Full-Fledged Multimedia Performance Environment
A new version of the critically acclaimed network-performance system Quintet.net has just been announced. Quintet.net, which computer musician Ian Whalley called a highlight of the 2003 ICMC, now features an elegant brushed-metal graphical user interface and many improvements under the hood, such as network jitter compensation.
With its Viewer add-on, the application is a full-fledged multimedia performance environment with real-time notation, microtonal playback capabilities and a suite of authoring tools, the Composition Development Kit. Quintet.net is based on Max/MSP/Jitter and is available for free for both the Macintosh OS X and Windows platforms.
The Hamburg Network Composers' Collective, founded in 2003, is a permanent ensemble for the performance of compositions written or transcribed for Quintet.net.
Quintet.net was featured in the 2004/05 issue on network music of the Neue Zeitschrift für Musik (with CD-ROM) and in the current, February 2005 issue of Leonardo Journal.
More information is provided at http://www.quintet.net (the site was just updated and features background information on several international Quintet.net projects as well as network music performance in general).
February 23, 2005
Seine hohle Form
Collaborating on Interactive Performance Works
Abstract: Composers and choreographers face unique and largely unexplored problems as they collaborate on interactive performance works. Not the least of these problems is settling on schemes for mapping the various parameters of human movement to those possible in the world of sound. The authors' collaborative piece, Seine hohle Form, is used as a case study in the development of effective mapping strategies, focusing on dance gesture to real-time music synthesis. Perceptual correlation of these mapping strategies is stressed, albeit through varying levels of abstraction. Read Seine hohle Form: Artistic Collaboration in an Interactive Dance and Music Performance Environment by Joseph Butch Rovan, Robert Wechsler and Frieder Weiß, Crossings: Electronic Journal of Art and Technology, Issue 1.2
January 24, 2005
Binary Ballistic Ballet
Silent Virtual Dancer
In Michael Saup's Binary Ballistic Ballet, participants send audio signals through a microphone to a computer, which transforms them into abstract shapes. A dancer then translates this information, displayed on a computer monitor, into a dance pattern on the stage.
The interactive choreography system "Binary Ballistic Ballet" was developed for the ballet piece "Eidos Telos" by William Forsythe, which premiered in January 1995 in Frankfurt. The ensemble had already been experimenting with an alphabetical dance system, which was then transformed into movement in three-dimensional space-time. For this production, the alphabet needed to be ported onto a computer platform to make it more fluid.
In parts 1 and 3, the computer picks words from the word database and displays them on several monitors that are visible only to the dancers. As a new word appears, it is directly influenced by the incoming soundtrack, i.e. the musician on stage can change the appearance of the current word in the following ways:
* depending on the frequency and amplitude of the incoming audio signal, the words can be rotated and translated;
* depending on amplitude, the words can be transformed into more abstract shapes, i.e. interpolated into more DNA-like forms;
* a spectral plot of the audio signal is created;
* lines are used to visualise the history of the system.
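The rules above can be sketched in code. The block below is a hedged Python/NumPy illustration, not Saup's actual implementation; all mappings and constants are assumptions. It derives amplitude and dominant frequency from an audio buffer and maps them to display parameters for the current word:

```python
import numpy as np

def word_transform(audio: np.ndarray, samplerate: int = 44100) -> dict:
    """Map an incoming audio buffer to display parameters for the current
    word: frequency and amplitude drive rotation and translation, and
    amplitude also drives the abstraction (DNA-like deformation) of the word."""
    amplitude = float(np.sqrt(np.mean(audio ** 2)))       # RMS level
    spectrum = np.abs(np.fft.rfft(audio))                 # spectral plot
    dominant_hz = float(np.argmax(spectrum)) * samplerate / len(audio)
    return {
        "rotation_deg": dominant_hz % 360.0,     # frequency -> rotation
        "translation": amplitude * 100.0,        # amplitude -> translation
        "abstraction": min(1.0, amplitude * 4.0) # louder -> more DNA-like
    }

# A one-second 440 Hz test tone produces a deterministic transform.
t = np.arange(44100) / 44100.0
tone = 0.25 * np.sin(2 * np.pi * 440.0 * t)
params = word_transform(tone)
```

In the live system such a function would run per audio frame, so the word's rotation, position and deformation track the musician continuously.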
The dancers can therefore decide which information to use:
* the alphabetical index of the word (red color);
* the color of the word, which is based on musical history;
* the direct meaning of the word and its transformation into dance patterns;
* the movement of the word in computer spacetime;
* the line-history on the display;
* the deformation of the word in space;
* the spectral information of the sound.
The choreography is generally as much as 70 percent predetermined; the remaining 30 percent is influenced by the computer system, meaning that the dancers receive information from one of the computer monitors and immediately transform it into dance patterns. The system behaves like a feedback loop between musician, dancer and computer.
In part 2, the computer is used to build "interactive creatures" that also react to incoming sound, for instance the monologue of a performer. There is also a "silent virtual dancer" that constantly interpolates between complex geometric shapes and accompanies the dancers on stage, reacting to the soundtrack. The resulting graphics are displayed as part of the stage show. [via Ars Electronica]
January 11, 2005
New Artistic Practices Meet Performance Art, Art and Science
Code Zebra is a highly interactive, interdisciplinary performance and software system where art meets science. The performances occur at sites around the world and on the web. CZ induces dialogues and debates between science (with an interest in computer and biological science) and the arts (including visual art, design, fashion and architecture). Code Zebra consists of fictional flirtation sessions between an artist and a scientist, and actual conversations and debates between the arts and sciences. Code Zebra is built as a scalable performance series that allows venues to plug and play.
Elements of the development of Code Zebra will be captured in video and Internet streams and form part of a significant archive that will be used in performing the work. Users on the site will experience live events via streamed audio, and at times video. Live events will be publicized through the web site. The project draws from performance theory (art, literature, sociology), discourse theory and grammatology, cross-disciplinary research methods, visualization theory and computer science practice, simulation, new media analysis.
The metaphor of the "zebra" is at the core of the project because zebras' stripes are reaction/diffusion patterns: a provocative but resolution-based way of describing dialogue. Evolutionary theory uses zebras as a case study, and reaction/diffusion has been at the core of this emerging performance series and at the heart of the operations of computer code. Zebras are the unexpected evolutionary protocol: they stand out in a crowd, but induce nausea in lions, their former predators. Above all, zebras survive by moving in camouflaged herds.
Performances will take place in various real and simulated locations. The software of Code Zebra also combines panel discussions and debates, live and on-line, chat, simulated conversations and patterns and visualizations of chat or performance art metaphors using software and fictional performances.
CodeZebra OS (Orifice System) Software
In November and December of 2000, a group of leading artist/software developers, streamed-media artists, discourse theorists, chat analysts, computer scientists and programmers gathered in San Francisco, led by Sara Diamond and hosted by the ArtsAlliance. Together they modeled Code Zebra, software that analyzes and allows people to author online chat and video streams, producing visual patterns that allow users to better understand and symbolize their own position within discussions. The software deploys reaction/diffusion patterns from nature, but permits individuals or groups to have a personalized pattern at any point in time. The design retreat included some of the world's leading thinkers in Internet dialogue analysis, streamed media, and pattern creation. It links to the larger Code Zebra project. The software is capable of analyzing all forms of chat, but will be focused on debates and discussions in art and science as a development phase.
The software provides users with the ability to: use visualization patterns to locate themselves within an Internet discussion; review their histories of dialogue; enter a deeply moderated or anarchic space, designated by topic and/or by mode of chat, and feel physically located in this space; monitor other chats simultaneously; use physical links to relate ideas; and build a personal pattern icon and a personal tale of chats and Internet dialogues that unfolds as a visually beautiful, navigable, shareable 3D and sound movie.
Fear and self-preservation need to be confronted and transcended in the process of enacting cross-disciplinary exchange. This will be so in the performances and is already a part of the software design process. The performances and the software constantly translate between the semantic meanings of concepts and people's relationships, turning them into visualizations and new levels of understanding. The software enables agency (you choose topics and people), but its intelligence constantly suggests new possibilities: ideas, related concepts and people for you to connect to. While appearing light, beautiful and playful, this project is in fact serious and viral.
The software development workshop may be a good example of the lock-up technique (referred to later), as the artist placed fifteen top-end researchers and coders in a room with her and several other artists. They came up with the chat visualization software that was a complex melding of computational linguistics, social instincts, and simulations of animal and human physical characteristics and evolutionary logic. First and foremost, they combined an emotional system with the anarchic or unconscious dynamics of Internet interaction. Code Zebra is all about process, allowing a series of lenses on the process of dialogue and creativity. While driven by the vision of one artist, it is highly collaborative and hence high risk, requiring the cooperation of different talents and disciplines for its success.
The user enters each chat or dialogue session by dwelling on the surface of the site and then diving into a familiar or seductive pattern. Once in the pattern, one moves through its moving mass and can stay at any point for dialogue. The software monitors styles of discussion through pattern analysis and can impose patterns on certain kinds of dialogues should these go astray, or at least suggest them. Patterns can be used to analyse who people are talking to, about what, but most profoundly how they are talking. Character scripts based on the reaction/diffusion character team (Code Zebra, Os Zealot, etc.) will arrive in at least text, if not visual, form to moderate discussions or suggest changes of mode (e.g. postings are limited to ten words and the conversation looks like playful leopard spots). This process of swimming through the topography of the site is called orifice systems (OS).
Each user creates their icon on the site, a moniker for HOW they interact. Each chat session produces a pattern that they can capture at any point in time. It forms a ring, or layer, attached to their icon. Over time, these layers build to create a personal tale. These are three-dimensional forms that the user can enter, fly through, fly around, and pull out layers of for reconsideration. Although highly visual (and eventually sonic), drilling down into these yields precise database information about who one has spoken to, about what, and in what style (aggressive debate, playful banter, formal panel discussion) at what time. The forms operate as 3D movies, luscious and beckoning. The icons sit on the surface of the site, but at any time users can share these or revisit them alone or accompanied.
January 09, 2005
UpStage Open Walk-Through
The next UpStage open walk-through will be on Wednesday January 12, 2005 at the following times: 1am California; 4am New York; 9am UK; 11am Finland; 8pm Sydney; 10pm New Zealand. Check your local time.
2004 was the first full year for UpStage: the first version of the software was launched on 9 January and immediately put to work in the World X project, a schools exchange between Aotearoa/NZ and the UK. In May, the first cyberformance using UpStage, "DTN2", was presented by Avatar Body Collision at the Machinista Festival, Glasgow. Zagreb Gay Pride participants used UpStage in June to give an online presentation about their march, and it was also used for presentation purposes at the /etc festival in Belgrade in July. In October a one-day online workshop was held for participants in Manchester, UK, with the tutor in Aotearoa/NZ. There were several other workshops and presentations during the year - more information and links to various projects are on the web site.
A significant event in December 2004 was the relocation of the UpStage server from MediaLab South Pacific, who kindly provided hosting during the development of the project, to our new sponsors CityLink. As we don't have any funding at the moment, finding a hosting sponsor was vital and we greatly appreciate CityLink coming on board without hesitation. Thanks! : )
Regular open walk-throughs began in October, on the first Wednesday of each month, and many of you have taken the opportunity to have a hands-on experience of UpStage. Everyone needed a bit of a break by the end of the year so the first walk-through for 2005 is taking place on the second Wednesday, which is the 12th - times above or see http://upstage.org.nz:8081. Please email helen @ creative-catalyst.com to register (the upstage email addresses haven't quite got hooked up properly since the server shift...you can't have everything all at once!).
We're enjoying the slower pace of the holiday season, so plans for 2005 are still formulating; we won't make any promises yet but we will keep you posted... meanwhile we wish you a peaceful and creative 2005 and hope to see you at the walk-through on the 12th or in the near future.
helen : ) (helen varley jamieson)
December 27, 2004
Emerging Infrastructures of All (Inter)net Research
Dr. Reinhold Grether's network research | netzwissenschaft site maps the "emerging infrastructures of all (inter)net research endeavours. net.science as an anthropology of connectivity is trying to overcome the constraints of specialist method transfers on net matters. the protuberance of technical networks necessitates a professionalization of human net knowledge. neither the isolation of concepts as in basic research nor the encapsulation of processes as in applied sciences will ever be able to adequately describe the complex autopoiesis of networks. net.science is undoubtedly developing into a scienza nuova of its own right."
December 06, 2004
Urban Mobile Theatre Platform
The TROIA project comprises the development and production of an urban mobile theatre platform by a wide range of leading European theatre-makers, artists, architects, engineers and scientists.
The whole purpose of TROIA is to focus public awareness and discussion on the technologies of political control within the context of a fast-developing Europe. It takes the most recent reports of the Manchester-based Omega Foundation as its starting point. From this, TROIA will develop a futuristic stage, a huge transportable building that will appear in the public forum and engage the public in ways expected and unexpected. Indeed, this mobile modular container will function like a latter-day Trojan Horse by infiltrating the conscious and unconscious precepts of visitors and passers-by alike. Though it has the appearance of a gift, it carries a dangerous and at times subversive content.
The architectural structure is a travelling citizens' forum, a hybrid info-box with a spectacular architectural presence. It is designed for temporary positioning in the centres of Kaunas, Vienna and Prague, cities central to the fault lines of recent European history. Following the project's completion, it is intended to install TROIA at a well-situated permanent site.
Embedded within the architecture is media art and technology that will engage actors and visitors in interaction. The subject matter is communicated by the employment of performers acting as undercover agents provocateurs with a performance based on a script that is individually generated by each host country where TROIA is exhibited, thereby making it as culturally relevant as possible. The actors, not detectable as such, mingle with the crowd and go to see the multi-media event with them. Their main task is to initiate conversations and to infiltrate the public space with information and opinions.
This combination of subtle dissemination and spectacular form should open a field of tension within the visitor that lasts well after the project is finished ... in fact planting a Trojan "seed" in the viewer that can inform the public discussion on the issues the exhibition raises and thereby contribute to the active development of the society in which it appears. [via mobile audience]
November 01, 2004
Software for online Performances
The UpStage walk-through is an opportunity to observe people at play in UpStage and to have a guided walk-through of the tools and ideas behind the software with the people who created it.
The next open walk-through will be on Wednesday 3 November at the following times: California, 2am; New York, 5am; UK, 10am; Helsinki, 12 noon; NZ, 11pm. Find your local time at http://www.worldtimeserver.com/
At this session, there will be 2 groups: one for people who are completely new to UpStage, and another for those of you who already know the basics and want to start experimenting. The groups will meet in separate stages - just follow the links from the foyer, http://upstage.org.nz:8081. There are two ways you can participate: you can either come as a "chatter" (audience member) to observe; or you can log in, get dressed (hold an avatar), and experience the full walk-through.
Chatters need only point their browser at http://upstage.org.nz:8081 at the specified time, and follow the links. You will then see and hear (turn your sound on!) what we're doing, and you can participate by typing in the chat window.
To register for the full walk-through, please reply to this email (helen @ creative-catalyst.com) and you will be sent a log-in. We also recommend that you download the manual (online at http://www.upstage.org.nz/download.html) and have a browse before the session.
October 23, 2004
Touch Through the Wires
Wirefire was an online performance and communications environment. The project, by entropy8zuper's Auriea Harvey and Michaël Samyn, utilized technology they developed to facilitate 'touch through the wires', combining chat, sounds, images, animations and live camera streams to form an interactive, improvisational expression that went beyond words. Wirefire is currently available in RANDOM FIRE/REPLAY VIEW only. [Flash 5 plug-in required.]
Wirefire was LIVE online every Thursday night from midnight to 1am Friday (Belgian time). Wirefire was meant to be performed and viewed online, but non-virtual Wirefire performances were presented in venues such as the Brooklyn Academy of Music, New York; the Walker Arts Center, Minnesota; GMI screen, London; and Passage44, Brussels.
October 20, 2004
A map larger than the territory
Mapping Flow, Crossed Paths
A Map Larger Than the Territory is a Web application that enables participants to represent their paths across the city using images, texts and sounds. Territory here is not a piece of land enclosed within borders but an interlocking network of lines or ways through. The map materialises and connects individual trajectories.
How does it work? Choose a city and a language. The map shows other people's paths in that city. A button at the right sends you to a blind map where you can add an itinerary of your own. To do so, you must first give it a name, a date and a color. Use the tools provided to locate places on the map and define points on your path. Each time you mark a location, a dialog box opens up for you to identify and describe it. When you have finished marking up your path, you can view the itinerary you have made.
October 14, 2004
Immersive and Interactive Narrative
The iCinema Centre for Interactive Cinema Research, established in 2001, is a joint venture of the College of Fine Arts and the School of Computer Science and Engineering at the University of New South Wales, Sydney. It brings together researchers and postgraduate students in new media, digital cinema, digital aesthetics, film theory, multimedia design, computer science, artificial intelligence and software/hardware engineering.
The iCinema research program focuses on the research and development of a digitally expanded cinema. This includes all forms of the moving image, made visible on any type of screen or in any sort of immersive environment, whose structure is constituted by various methods of narrative coherence. The project is directed by Jeffrey Shaw and Dennis Del Favero.
The Centre has four principal research domains:
Immersive Visualization Systems: The exploration of immersive environments which provide for the collection, integration and display of visual, audio and kinesthetic data.
Distributed Interface Systems: The integration of interface systems with the experience of distributed spatial visualization environments.
Interactive Narrative Systems: The exploration of interactive narrative systems which provide the viewer with the ability to select and edit interaction with a set of visual narrative streams.
Theories of Interactive Digital Narrative Systems: The inquiry into theories of narrative and the organization of units of meaning and experience within the digital domain.
October 08, 2004
Mooning You 24-7
Moon Radio is now previewing their MyTV prototype, a programme of live, broadcast events that provide an opportunity for artists, producers and the Moon Radio community to plug in a camera, choose a time and broadcast live to a worldwide audience 24-7.
MyTV will be launched online later in 2004. Until then, these performances will be archived and available for viewing at Moon Radio. Moon Radio webTV is a web streaming channel hosting live broadcasts, an archive of diverse films, documented live events, and an active online community of artists, filmmakers, content producers and regular viewers. It began in 2000 as an audio and video web streaming channel for artists to explore live web streaming technology. The project commissioned artists and hosted live events both online and in arts venues around the UK. Moon Radio webTV has developed to focus on building tools for the Moon Radio community. These include hosted forums, profiles of community members, and text messaging services.
October 03, 2004
open sessions in UpStage
Avatar Body Collision announces monthly open sessions in UpStage. The sessions will take place on the first Wednesday of each month, with times & dates on the UpStage foyer: http://upstage.org.nz:8081. UpStage is a web-based venue and tool for artists who wish to compile different digital media into live performances, in real time, for online audiences.
Starting this Wednesday, 6 October, open sessions will be held where you can observe people at play in UpStage and have a guided walk-through the tools and ideas behind the software with the people who created it.
Times for the first session on 6 October are:
New York: 5am
Tel Aviv: 11am
Helsinki: 12 noon
We know these times won't suit everyone; we will do another session at 10.30pm UK time (2.30pm California, 5.30pm New York) ONLY IF there are people who want it. You MUST email if you want to come to the 2nd session otherwise it WON'T happen. We are also open to suggestions for other times that would suit people, for example a weekend time.
There are two ways you can participate in the open session: you can either come as a "chatter" (audience member) and participate or lurk in the chat; or you can log in, get dressed (hold an avatar), and experience the full walk-through.
Chatters need only point their browser at this URL at the specified time: http://upstage.org.nz:8081/stages/virtual-tourist You will then see and hear (turn your sound on!) what is happening. You can participate by typing in the chat window.
To register for the full walk through, please reply to this email -- Helen Varley Jamieson [helen at creative-catalyst.com] -- and you will be sent a log-in. We also recommend that you download the manual (online at http://www.upstage.org.nz/download.html) and have a browse before the session begins.
Any questions, email helen at creative-catalyst.com
September 14, 2004
A new medium for online performance, theatre and storytelling is now in its first release. UpStage is a web-based venue and tool for artists to compile different digital media for textual and audiovisual communication into a live performance, in real time, for online audiences.
The first release of the software was launched on 9 January 2004, and online walk-throughs were held in February to give people an idea of how UpStage works from the player's perspective. These sessions will be continued on a regular basis, led by the members of Avatar Body Collision. If you are interested in having a hands-on experience with the software, and participating in live improv sessions, email colliders at upstage.org.nz for further information and to be notified of times. Visit the UpStage foyer, from where you can access a sample stage.
Currently, UpStage is being used for WorldX, a virtual exchange between schools in the UK and New Zealand, and DTN2, the first cyberformance using UpStage, was performed live from the Machinista Festival in Glasgow on Sunday 9 May.
September 07, 2004
Dance and Mutable Media
"I envision the development of UNSTABLELANSCAPE beyond the local performance and gross movements of the dancers towards the use of autonomous streams of data, subtle measurement of biological functions (breathing, heart beat, earth movement, GP systems and distributed performance using the internet)...I would like to explore the augmentation of the system hybridicity and maximizing its bottom-up architecture with the integration of organic tissue such as neurons and muscle cells to the real time aspect of the video-sound-movement continuum (residency at Simbiotica, Australia). Later, I will expand the performance with this bio-digital generativity and with analog/biologically inspired robots. It is a continuation of the aesthetics of emergence."
marlon barrios solano
August 23, 2004
A Transatlantic Collaborative
Multimedia Protest Jam
Dissension Convention will coincide with the Republican Convention in New York. 10 pairs of net/digital artists from the Americas and Europe will create live, online multimedia performances. These will be projected at RNC NODE Postmasters Gallery, as well as by other appointed NY platforms in 'store windows, bars.'
5 days of live mix performances, online in real-time from 29th August-2nd September between 12noon and 6pm (NY time) via Furtherfield's VisitorsStudio.
If you are viewing from home you can visit versions mirrored on the artists' websites (check project URL for updates).
* If you wish to mirror this event on your site please target this file http://www.furtherstudio.org/live/dissensionconvention.swf
Sunday 29th August
4-7pm NY (9-12pm BST) *Maya Kalogera & Marc Garrett*
7-10pm NY (12-3am BST) *Moport.org & Glowlab.org*
Monday 30th August
4-7pm NY (9-12pm BST) *Chris Webb & Sim (Soy.de)*
7-10pm NY (12-3am BST) *Lewis Lacook & Alan Sondheim, Sheila Murphy & Jason Nelson*
Tuesday 31st August
4-7pm NY (9-12pm BST) *Helen Varley Jamieson & Karla Ptacek*
7-10pm NY (12-3am BST) *Joseph and Donna McElroy*
Wednesday 1st September
4-7pm NY (9-12pm BST) *Neil Jenkins & Roger Mills*
7-10pm NY (12-3am BST) *Digitofagia vs. Autolabs*
Thursday 2nd September
4-7pm NY (9-12pm BST) *Michael Szpakowski & Ruth Catlow*
7-10pm NY (12-3am BST) *Ryan Griffis & Mark Cooley*
Postmasters Gallery is creating RNC NODE, a way-station serving as a physical node of an ad-hoc public broadcasting system of online, real-time protest performances and alternative news actions. All online streams will also be output in local bars and as projections from windows. Dissension Convention will be part of this programme.
DissensionConvention is a Furtherfield project.
August 19, 2004
New Moon Radio
Active Ingredient have been creating exciting and innovative projects since 1996. Their work has included: Chemical Garden, “A modular garden built in units containing strange plant life and internet controlled robots”; Ghost Engine, a live online séance; Big Up Yourself, a participatory project working with Galleries of Justice.
Moon Radio, begun in 2000, has hosted many live broadcasts, new films and commissioned projects. All New Moon Radio... marks a new era for the project, reflecting the changing pattern of innovation on the web. Now at the heart of the project are the audience, their web community, interactivity and participation. Using chat rooms, SMS texting, community profiles (and, coming soon, artist blogs), Moon Radio aims to become a space that is alive with debate, activity and new projects, reflecting the audience's needs and desires.
August 14, 2004
Topological Media Lab
Responsive Media/Expressive Instruments
The Topological Media Lab provides a locus for studying gesture and materials from phenomenological, social and computational perspectives. TML research invents responsive media and expressive instruments that support novel technologies of performance and the architecture of hybrid media spaces. The products of the laboratory are (1) scholarly presentations, (2) media artifacts and performances as pieces of cultural experiment, (3) opportunities for students of design to sharpen critical faculties in project-based work.
Current application domains include: realtime video and sound synthesis, sensors, physical computing, computer-mediated human interaction, media choreography, active fabric and wearable architecture. Topological media is physical and computational matter, image or sound fashioned as substances evolving under continuous action.
The TML draws insights from studies of embodiment and materiality, performance and music, as well as dynamical systems and differential geometry and topology. Its projects also serve as case studies in the construction of fresh modes of cultural knowledge and techno-scientific practice.
August 11, 2004
A platform for collaborative performance
"The Pavilion: Into the 21st Century -- A Laboratory for Social Experimentation"
by Randall Packer, PlaNetwork Journal
"Inspired by Billy Klüver's 1970 art and technology masterpiece, the Pepsi Pavilion, artist/theorist Randall Packer is creating a NASA-supported digital, interactive artwork that will also be a platform for collaborative performances." The project "responds to Klüver's idea of freeing the spectator to make his or her own connections in the experience of the artwork."
"Pavilion: Into The 21st Century" will provide a platform for creation: a programmable, interactive 'multimedia performance instrument', a laboratory and showcase for media experimentation. This multimedia 'instrument' will offer a critical forum for researching and advancing the integration of art, music and science into singular, multi-disciplinary artworks. It will also allow for the consideration of the social implications of emerging forms of interactivity in networked environments.
For more information
With thanks to Ken Jordan