
Live TouchDesigner Visuals

2025 · live-av, performance, interactive, ai

About

The project employed a dual-computer setup: one machine analyzed audio from the DJ equipment and sent the analysis data via UDP to a second machine running the visuals. The visual pipeline began with Kinect body-tracking data, which was textured with various forms of noise animated by the audio analysis, then processed through StreamDiffusion.
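As a rough illustration of the transport layer, here is a minimal Python sketch of the analysis machine's send side, assuming per-frame audio features such as band levels and a beat flag; the address, port, and JSON message format are illustrative assumptions, not the project's actual configuration.

```python
import json
import socket

# Hypothetical address of the visuals machine; the real host/port
# used in the project are not documented here.
VISUALS_ADDR = ("192.168.1.20", 7000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_analysis(low: float, mid: float, high: float, kick: bool) -> None:
    """Pack one frame of audio-analysis features and fire it over UDP.

    UDP suits this use: a dropped frame of analysis data is harmless,
    and low latency matters more than guaranteed delivery.
    """
    payload = json.dumps({
        "low": low,    # bass band level, 0..1
        "mid": mid,    # mid band level, 0..1
        "high": high,  # treble band level, 0..1
        "kick": kick,  # onset/beat flag
    }).encode("utf-8")
    sock.sendto(payload, VISUALS_ADDR)

# Example: one frame of made-up analysis values.
send_analysis(low=0.82, mid=0.41, high=0.17, kick=True)
```

On the visuals machine, a matching receiver would unpack each frame and route the band levels into the noise animation parameters.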

During the event, people discovered they could adjust the AI prompts and denoising parameters to generate their own little collaborative dreams. This hands-on experience meshed with that of those simply enjoying the music, who felt the additional energy of abstract and representational forms moving with them.
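To give a concrete sense of how audience input could be wired into the pipeline, below is a hedged TouchDesigner sketch of a Parameter Execute DAT callback forwarding a prompt and a clamped denoising strength to a StreamDiffusion component. The operator name 'streamdiffusion1' and all parameter names are hypothetical, standing in for whatever the project actually exposed.

```python
# Parameter Execute DAT callback (TouchDesigner) -- a sketch only:
# 'streamdiffusion1' and the 'Prompt'/'Denoise' custom parameters are
# illustrative names, not the project's actual operator layout.

def onValueChange(par, prev):
    sd = op('streamdiffusion1')
    if par.name == 'Audienceprompt':
        # Pass the audience-entered text through as the diffusion prompt.
        sd.par.Prompt = par.eval()
    elif par.name == 'Audiencedenoise':
        # Clamp denoising strength so an extreme value cannot wash out
        # the Kinect silhouette or freeze the image entirely.
        sd.par.Denoise = max(0.1, min(0.9, float(par.eval())))
    return
```

Clamping the denoise range is the kind of guardrail that keeps open-ended audience control from breaking the visuals while still leaving the output feeling unpredictable.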

Artist's Intention

Generally, the music and visuals at an event are components of the party that the audience exclusively consumes. Adding these interactive elements turned that notion on its head: anyone willing to put extra effort into participating could have immediate influence and feedback. The act of participating was very conscious, but the experience and feedback were unconscious, as participants became immersed in the somewhat unpredictable idea they had put forth into the system. Ideally, in the moments they experience their musical and visual concoction, they lose their sense of reality and are fully enveloped by the encompassing music and visuals.

Gallery