WELCOME TO THE SURFACECOLLIDER BLOG, AN ARCHIVE OF IMAGES, AUDIO, TEXTS AND AFK RESEARCH AROUND THE POINT AT WHICH CODE BECOMES IMAGE.


Re-Living the Rendered Image through AI.

I’ve been putting some of my rendered images through Stable Diffusion – an open-source AI image-generation model. This is an example from a series of images made by asking the AI to reimagine the renders as 4 bluebottle flies sitting on a rock.

Text Presented to the KSA Contemporary Art Research Group. ICA, London (November 2022).

I lie back and shut my eyes. I’m in someone else’s sleeping bag. Trying to force my legs into it is like trying to put your fingers into out-of-shape gloves – where the topology of the inner lining has become misaligned from the outside of the glove. I push my legs through and concentrate on my thoughts. They have become visual. Behind my eyelids I can see colours. They are shifting and shimmering and melting across more than two dimensions. What I’m seeing is a simulation of digital space – one that somehow aligns the physiology of my body and its perceptive capacities with visual imaging machines. They have become the same thing, my body and the computer. Like the backs of my eyelids are computer screens and that space between my body and the screen has dissolved. It’s similar to a psychedelic experience I suppose. Is the computer inside me now? Have I become computational? 

I take a picture of the space behind my eyelids and then text it to my friend. “I’m having a computer-inside-my-body experience.” A few minutes later he responds. “You are now a cyborg.” It’s true, I feel like I’ve merged with the computer in some way. The boundaries between my body and the digital world have become blurred. I’m not sure if I’m scared or excited by this new development. I go on to explore more of the digital space inside my eyelids. It’s a fascinating and weird place – a mix of the real and the virtual. I can see my bedroom in real-time, but there are also all sorts of weird and surreal elements mixed in – like creatures that are half-human, half-machine. It’s as if the digital space has its own reality, separate from the “real” world. My experience as a cyborg is one of confusion and fascination. I’m not sure what to make of this new world that I’ve discovered – a world where the boundaries between my body and the digital world are blurred.

The physical boundary between the edges of our bodies and the technologies we extend ourselves through is obfuscated by the seemingly impenetrable, endlessly complicated inner workings of the machine. This edge or boundary is routinely described using the metaphor of the ‘black box’. The Radiolab podcast’s ‘Black Box’ episode, first released on January 17th, 2014, opened up the idea of the black box beyond the digital confines we are most familiar with. Broadening the scope of the technological metaphor, the presenters Jad Abumrad and Robert Krulwich describe the black box as:

“a thing. It’s a box that something goes in, you can see what that is. Something comes out which is different, and you can see that.”

“But you do not know what’s going on in the middle.”

“It’s a mystery.”

One example they land on to expand the metaphor is the butterfly chrysalis:

“At a certain point in all caterpillars’ lives, after they’ve eaten a lot of leaves, they hit a certain weight… Some hormones start pumping, some genetics turn on, and it starts growing a little shell. That’s the chrysalis. And inside that chrysalis, as we know … A caterpillar becomes a butterfly or moth.” 

As it turns out, we don’t know very much about how this happens. How a caterpillar turns into a butterfly or moth during the inner chrysalis period of metamorphosis remains a mystery. During the episode, producer Molly Webster and her guide, Andrei Sourakov, cut open the outer shell of a day-old butterfly chrysalis to reveal a pale, white-yellow, very liquidy… goo. When the caterpillar enters the chrysalis it melts into a soup of cells. It is this gooey state of the caterpillar’s transition into the butterfly or moth that has posed metaphysical, quasi-religious, semi-mythical, philosophical questions throughout the ages.

In the 1600s, naturalists believed that upon entering the chrysalis the caterpillar died, only to be resurrected as a butterfly, taking believers on a kind of spiritual ascent. The caterpillar was seen as a symbol of our earthbound, fleshy bodies, whilst the butterfly represented a position of solace: the perfect state of our souls up in heaven. Upon reflection, the sheer scale of the transformation – made manifest by the goopy state – began to freak people out. If we change that much on the journey to heaven, then is it even ‘us’ up there?

To begin to ask the question of what is maintained – what carries through from the caterpillar, to the goo, to the butterfly – the producers turn to Martha Weiss, associate professor of biology at Georgetown University. Weiss describes an experiment she conducted, subjecting caterpillars to an experience that perhaps wouldn’t sit too well were the episode to be aired today. For the experiment, she introduced a group of caterpillars to a plant-based odour and then shocked them for ten seconds, repeating this until nearly all of the caterpillars had learned to find the smell repulsive. When faced with the odour they began to turn away and head in the opposite direction. After the caterpillars pupated into moths, the scientists introduced the odour again. The smell, which wouldn’t usually bother them, was hated by the moths that emerged. Somehow they held the unpleasant association from their caterpillar state. A memory made it through the goo, or as Webster puts it:

“Out there floating in that sea of goo is actually a tiny little speck of brain. Some of the brain is dissolved away, but there’s this, like, microscopic fragment that has made it through. And Martha suspects that nestled into that fragment is this memory.”

As Kenric Allado-McDowell points out in Holly Herndon and Mat Dryhurst’s ‘Interdependence’ podcast episode ‘Pharmako-AI: co-writing with an AI and navigating the dark hallways of the mind’, butterflies and moths have themselves evolved through a process of emergence that encompasses and speaks from wider ecological contexts. He gives the example of a fish that sees a moth with eye-like patterns on its wings. The fish understands the moth to be a predator – a cat or an owl, perhaps – and instead of attacking the insect, instinctively swims away. Allado-McDowell draws on Jakob Johann Uexküll’s wider writings around biosemiotics to suggest that, whilst the moth doesn’t possess the knowledge that it looks like a cat or an owl, there’s something being processed in the wider biological context that possesses such a knowledge of the food chain, of the dangers posed by other animals to the butterfly or moth. In other words, a metapopulation is at work in the ecology at large that somehow allows eye-like patterns – as a defence mechanism – to emerge on the moth’s wings. The wider point here is that emergence happens within a context.

If we think of ourselves as the vehicles that allow technologies to emerge, Allado-McDowell’s analogy of the butterfly or moth opens doorways through which we can shine a light on how we hold aloft notions of individual identity and begin to challenge them. How separate are we from emergent technologies that instantiate new forms of language and images? How separate are we from the context within which this emergence is happening? How are we developing eye-like patterns on our wings?

Allado-McDowell wrote the book discussed in the podcast, ‘Pharmako-AI’, collaboratively with the large language model GPT-3. GPT-3 is an artificial intelligence system built by the company OpenAI to generate machine-written text from human input, or prompts. It is the third generation of neural-net language models trained by the company on vast portions of the Internet’s written data, supplemented by digitised books and other written source material. When you write with the model, the text it generates reads as if it were written by a human. Portions of the text I am reading now have been written by GPT-3. If you give the model a prompt and ask it to generate text, it will predict what should come back, word by word, according to probability.
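That word-by-word prediction loop can be sketched in miniature. The “model” below is a hypothetical hand-written probability table, not a trained network – real models like GPT-3 predict over tens of thousands of tokens at every step – but the generation loop is the same shape: look at the context, sample the next word from a probability distribution, append, repeat.

```python
import random

# Toy stand-in for a language model: for each word, a probability
# distribution over possible next words. Entirely illustrative.
TOY_MODEL = {
    "the": {"caterpillar": 0.6, "butterfly": 0.4},
    "caterpillar": {"melts": 0.7, "eats": 0.3},
    "butterfly": {"emerges": 1.0},
    "melts": {"into": 1.0},
    "eats": {"leaves": 1.0},
}

def next_word(context_word, rng):
    """Sample the next word from the model's distribution for the context."""
    dist = TOY_MODEL.get(context_word)
    if dist is None:
        return None  # no known continuation: stop generating
    words = list(dist)
    weights = [dist[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

def generate(prompt_word, max_words=5, seed=0):
    """Extend a one-word prompt, one probabilistic prediction at a time."""
    rng = random.Random(seed)
    out = [prompt_word]
    for _ in range(max_words):
        w = next_word(out[-1], rng)
        if w is None:
            break
        out.append(w)
    return " ".join(out)
```

Calling `generate("the")` walks the table probabilistically, so different seeds yield different continuations – the same reason two identical prompts to GPT-3 can produce different text.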

To allow the images I have been making for this project to be contextualised by the wider ecology of a metalanguage, I have trained a version of GPT-3 on four texts that have been key to the research to date. They are Katherine Hayles’s ‘My Mother Was a Computer’, Neal Stephenson’s ‘Snow Crash’, Mark Fisher’s ‘The Weird and the Eerie’, and Vilém Flusser’s ‘Into the Universe of Technical Images’. By training the model on these four books, I wanted to offload the voice of the writer – at least in part – to the machine. The model exists as a pool of language to draw from when providing the images with a metanarrative. I hope to get to a point where the text, generated when the pages load, envelops the imagery of the surfacecollider website with a theory-fiction through which it can talk about itself.
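At the time of writing, OpenAI’s GPT-3 fine-tuning pipeline expected training data as a JSONL file of prompt/completion pairs. A minimal sketch of how passages might be packaged into that format – the prompts and passages below are placeholders, not the actual training data drawn from the four books:

```python
import json

# Hypothetical (prompt, completion) pairs standing in for excerpts
# from the source texts; the real training corpus is not reproduced here.
passages = [
    ("Describe the technical image.", "An image produced by apparatuses rather than hands."),
    ("Describe the eerie.", "A failure of absence, or a failure of presence."),
]

def to_jsonl(pairs):
    """Serialise (prompt, completion) pairs as JSONL, one JSON object per
    line, the shape the legacy GPT-3 fine-tuning endpoint accepted.
    A leading space on the completion was a common convention."""
    lines = []
    for prompt, completion in pairs:
        lines.append(json.dumps({"prompt": prompt, "completion": " " + completion}))
    return "\n".join(lines)
```

The resulting file would then be uploaded to the fine-tuning job; the function here only shows the data preparation step, which is the part that shapes what “pool of language” the tuned model draws from.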

In the Interdependence podcast, Allado-McDowell goes on to suggest that if we are being challenged by AI, it is not because we will be replaced by it, but because it’s a new form of consciousness being expressed through matter. We are being challenged by the mode of creation itself. If AI is reflecting ideas around how we can hold an identity, then how would it identify if the mirror were held up to itself?

surfacecollider New Art City space for Six Minutes Past Nine

The surfacecollider space that I was invited to build as part of an online residency with the curatorial platform Six Minutes Past Nine is now live and can be accessed here: