( via teeming void )
Game/art notable Julian Oliver (aka delire) has been using game engines for audiovisual performance since way back. In 2001 I saw him play a Quake mod that had been rigged with audio samples and proximity triggers to create an immersive first-person performance tool; a digital hardcore jumping castle (I think the system was related to the later q3apd). In conversation at the same event, he argued for the potential of this approach. I saw the Quake mod as an ingenious sample trigger interface – a kind of 3D drum machine – but Oliver was looking ahead to realtime manipulation and deformation of geometry and sound. In retrospect he was evoking a form of synaesthetic media, where spatial and sonic attributes are fused and cross-mapped, so that the form is the sound. Gesture is significant here too – in performance practice gesture is at the interface of space, motion and sound. Oliver was imagining dynamic form as an articulation of sonic gesture, but also the prospect of folding back 3D form into sound; procedural texture-mapped geometry as a sonic provocation. What does this sound like?
This conversation came back to me vividly when I ran into fijuu2, a project by Oliver and Steven Pickles. Fijuu comes close to realising what Oliver imagined in 2001: a plastic, gestural, realtime audiovisual 3D environment. Forms twist, shatter and rotate, hovering inside cylindrical arcs of a gesture sequencer. Sound and form transform in unison, evoking a third, more abstract thing: the map or pattern that links them. Global filters influence sound and image, making another (logical) map between pixel shaders and audio effects. It’s great to see lush, gaming-grade 3D graphics diverted towards a more abstract aesthetic of play.