WELCOME TO THE SURFACECOLLIDER BLOG, AN ARCHIVE OF IMAGES, AUDIO, TEXTS AND AFK RESEARCH AROUND THE POINT AT WHICH CODE BECOMES IMAGE.


Maggie Roberts talks Synthetic Imagery…

I spoke to artist Maggie Roberts about her long history of working with synthetic digital images, both independently and as part of Orphan Drift. Here’s an 18-minute edit of our Zoom chat, transcribed and floating over a selection of the surfacecollider material…

Video: “Removal” by James Irwin, on Vimeo.

Vitalism as a streamed performance

I’ve been building towards new work in the studio recently, using surfacecollider.net as the basis for a screen-based performance of the images and sounds generated through the web pages as I navigate them; a collaboration between me as wetware, the software and the hardware. As a live-streamed event (through YouTube or Twitch), the work will come together and find form across the computer screens of the audience.

The WebGL-generated imagery will be streamed through an external web camera positioned close in front of the computer display the web pages are playing through. Remediating the imagery through the camera imbues the digital images with the light of the room. The new images are hybrids, combining aesthetics inherent to camera footage with those of synthetic digital images. They become haunted by a degraded eeriness, devoid of the signs and signifiers that would enable us to categorise them. The recombinant imagery becomes a new entity, distinct from all of the combined elements.

The sound accompanying the work will be generated by mic’d-up objects in the studio, in a way that also takes on the acoustics of the room. Various pieces of hardware sit on top of cardboard boxes, and all of these are used to generate the sound. Electromagnetic coil and contact mics turn otherwise indecipherable electrical and kinetic energy into sound as I move them across the different electronic devices: LCD displays, amplifiers, Arduino microcontrollers and so on. The drone-like sound generated through the work is amplified and reverberated through the room by exciter speakers placed on the surfaces of the cardboard boxes. The sound of the room is in turn relayed through microphones to the loudspeakers of the audience.


What you’re getting as a viewer on the other end of the live stream is this remediated tech-eye or tech-ear perspective on the work. I’m interested in exploring what happens when the technology used to interface with digital images is imbued with vitalism. How do we flatten the plane between us and tech, and let the electronic circuitry and mechanics of digital hardware come to life through code? Hidden behind the work, a new body, vital to the work yet invisible to the viewer, will produce images and sound across the screens and loudspeakers of the audience, offering a fresh perspective on the technologies that extend and link us: one that inserts life into technology as a prosthetic extension of the body.