WELCOME TO THE SURFACECOLLIDER BLOG, AN ARCHIVE OF IMAGES, AUDIO, TEXTS AND AFK RESEARCH AROUND THE POINT AT WHICH CODE BECOMES IMAGE.

NAVIGATE THROUGH THESE DIFFERENT CATEGORIES OF RESEARCH USING THE ‘CONTEXT +’ DROP-DOWN MENU ABOVE.

Research Presentation at Raven Row Gallery for the Contemporary Art Research Group.

Interview: Walkthrough of the surfacecollider space for New Art City Festival 2023

I chatted about the New Art City surfacecollider space with the brilliant Sammie Veeler. The work was a part of New Art City Festival 2023.

Artist Talk at Kingston School of Art

James Irwin is an artist and PhD researcher at the Contemporary Art Research Centre, Kingston. He works with web technologies, AI systems and digital sound and image to investigate the notion of a vital life force inherent within digital media.

By creating cognitive assemblages – made from a combination of networked digital hardware, software and human wetware – his work builds from new materialist ideas around decentering the human, undoing our role as autonomous individuals and pointing to the ways in which the production of subjectivity is offloaded onto forces outside of our bodies; the posthuman is biological, but also networked and dispersed through machines.

His talk at Kingston will focus on the AI text-generation system (a fine-tuned GPT-3 model) and images housed at surfacecollider.net (ongoing). The work bypasses photographic modes of representation to construct spaces within contemporary art where the viewer is brought together with the synthetic languages of machines. Across the pages of the surfacecollider website, new writings and images are triggered by user interaction, recasting the visitor as an active agent in the production of the work alongside hardware and software systems, the artist, and previous visitors who have also contributed to the chain.
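The interaction loop described above – each visitor's input extending a chain seeded by earlier contributions – might be sketched roughly as follows. This is a minimal illustrative sketch, not the site's actual code: the `generate_text` stub and the prompt-chaining scheme are assumptions standing in for the fine-tuned GPT-3 model behind surfacecollider.net.

```python
# Illustrative sketch only: a stub stands in for the fine-tuned GPT-3
# call used by surfacecollider.net; the real site's code is not shown.

def generate_text(prompt: str) -> str:
    """Placeholder for a call to a hosted text-generation model."""
    # A real implementation might call a model API here instead.
    return f"[generated continuation of: {prompt[-40:]}]"

class VisitorChain:
    """Each visitor's interaction extends a shared chain of writings."""

    def __init__(self) -> None:
        self.history: list[str] = []

    def interact(self, visitor_input: str) -> str:
        # Seed the prompt with recent contributions, so previous
        # visitors remain active agents in the production of the work.
        prompt = " ".join(self.history[-3:] + [visitor_input])
        new_text = generate_text(prompt)
        self.history.append(new_text)
        return new_text

chain = VisitorChain()
first = chain.interact("a surface where code becomes image")
second = chain.interact("the light of the room")
```

The design point the sketch tries to capture is that generation is never stateless: each output folds earlier outputs back into the prompt, which is what recasts the visitor as one agent among several.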

surfacecollider New Art City space for Six Minutes Past Nine

The surfacecollider space that I was invited to build as part of an online residency with the curatorial platform Six Minutes Past Nine is now live and can be accessed here:

Maggie Roberts talks Synthetic Imagery…

I spoke to the artist Maggie Roberts about her long history of working with synthetic digital images, both independently and as a part of Orphan Drift. Here’s an 18-minute edit of our Zoom chat, transcribed and floating over a selection of the surfacecollider stuff…

Talking Synthetic Images with Maggie Roberts from James Irwin on Vimeo.

Vitalism as a streamed performance

I’ve been building towards new work in the studio recently, using surfacecollider.net as the basis for a screen-based performance of the images and sounds generated through the web pages as I navigate through them; a collaboration between me as wetware, the software and the hardware. As a live streamed event (through YouTube or Twitch) the work will come together and find form across the computer screens of the audience.

The WebGL-generated imagery will be streamed through an external web camera positioned closely in front of the computer display the web pages are playing through. Remediating the imagery through the camera imbues the digital images with the light of the room. The new images are hybrids, made by combining aesthetics inherent to camera footage with those of synthetic digital images. They become haunted by a degraded eeriness, devoid of the signs and signifiers which would enable us to categorise them. The recombinant imagery becomes a new entity distinct from all of the combined elements.

The sound accompanying the work will be generated by mic’d-up objects in the studio in a way that also takes on the acoustics of the room. Various pieces of hardware sit on top of cardboard boxes, all of which are used to generate the sound. Electromagnetic coil and contact mics turn otherwise indecipherable electrical and kinetic energy into sound as I move them across the different electronic devices – LCD displays, amplifiers, Arduino microcontrollers, etc. The drone-like sound generated through the work is amplified and reverberated through the room by exciter speakers placed on the surface of the cardboard boxes. The sound of the room is in turn relayed through microphones to the loudspeakers of the audience.

What you’re getting as a viewer on the other end of the live stream is this remediated tech eye or ear perspective on the work. I’m interested in exploring what happens when the technology used to interface with digital images is imbued with vitalism. How do we flatten the plane between us and tech, and let the electronic circuitry and mechanics of digital hardware come to life through code? Hidden behind the work, a new body – vital to the work and yet invisible to the viewer – will produce images and sound across the screens and loudspeakers of the audience, offering a fresh perspective on the technologies which extend and link us: one that inserts life into technology as a prosthetic extension of the body.