Over the past two years, I have become increasingly interested in the rich potential of hybrid analog and digital systems for real-time performative processing. There is an interesting tension in combining the precision of digital with the unpredictability of analog. I used a hybrid setup for my recent performance Multiplying Muybridge, where an analog synthesizer created live sound that was manipulated by a theremin-like depth sensor outputting control voltage to the system. This instrument, along with a handful of oscillators, is digitized into MIDI and used to generate patterns in Processing. Those patterns are then sent to Max via Syphon, which chromakeys six separate videos into the patterns. Each of the three patterns can be controlled via a USB MIDI controller, with two knobs per pattern for the X and Y directions. I use OSC (Open Sound Control) to pass that control data from Max to Processing.
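To make that control routing concrete, here is a minimal sketch, in Python rather than the actual Processing/Max patch, of how two MIDI knobs per pattern might map to normalized X/Y positions. The CC numbers and pattern names are hypothetical, chosen only for illustration.

```python
def cc_to_position(cc_value, lo=0.0, hi=1.0):
    """Map a 7-bit MIDI CC value (0-127) to a normalized coordinate."""
    return lo + (cc_value / 127.0) * (hi - lo)

# Hypothetical CC assignments: two knobs per pattern (X and Y).
PATTERN_KNOBS = {
    "pattern1": {"x": 20, "y": 21},
    "pattern2": {"x": 22, "y": 23},
    "pattern3": {"x": 24, "y": 25},
}

def handle_cc(state, cc_number, cc_value):
    """Update pattern positions from an incoming control change message."""
    for pattern, knobs in PATTERN_KNOBS.items():
        for axis, cc in knobs.items():
            if cc == cc_number:
                state.setdefault(pattern, {})[axis] = cc_to_position(cc_value)
    return state
```

In the actual setup this mapping lives inside the Max patch, which then forwards the values to Processing over OSC.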
Why describe this in such detail? I am interested in the notion of open, modular systems for making hybrid works. Philosophically, this approach to tool creation and workflow is inspired by the design sensibilities of analog video synthesizers created in the 1970s by toolmakers such as Dan Sandin, Bill Hearn, and Dave Jones. The relationship these systems had with control-voltage interfaces, built for maximum variability, stands in stark contrast to many of the professional software packages used today.
In December 2015, my friend and fellow artist Jason Bernagozzi and I started a collaboration to develop software tools for time-based media artists as a way to support the fundraising efforts of Signal Culture. We wanted to make software inspired by the open systems of these early video pioneers and philosophically grounded in modularity. We are interested in creating tools that are real-time, performative, modular, and exploratory. The first app Jason and I wanted to work on was inspired by the classic Frame Buffer created by legendary toolmaker Dave Jones of Dave Jones Design, who has made significant contributions to the history of video art. The Frame Buffer application saves a series of video frames into memory and repeats them over one another within the keyed areas of either a lumakey or a chromakey. The process is simple; what we wanted to focus on, however, was making the application controllable by a wide range of sources.
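The core of the frame-buffer process can be sketched in a few lines. This is a minimal grayscale illustration of the idea, not the app's actual implementation: frames are flat lists of brightness values in 0.0–1.0, and pixels above a luma threshold are replaced by pixels recalled from an earlier stored frame.

```python
from collections import deque

class LumaFrameBuffer:
    """Minimal sketch of a lumakey frame buffer: past frames sit in a
    ring buffer, and bright (keyed) pixels are replaced by pixels
    recalled from an earlier frame. A real implementation would work
    on full-color video and support chromakeying as well."""

    def __init__(self, depth=8, threshold=0.5):
        self.frames = deque(maxlen=depth)  # ring buffer of past frames
        self.threshold = threshold

    def process(self, frame, delay=1):
        self.frames.append(list(frame))
        # Recall a frame `delay` steps back, clamped to what is stored.
        delay = min(delay, len(self.frames) - 1)
        recalled = self.frames[-1 - delay]
        # Where luma exceeds the key threshold, show the recalled frame.
        return [r if p > self.threshold else p
                for p, r in zip(frame, recalled)]
```

Because the recalled frames keep re-entering the keyed areas, bright regions of the image smear and echo across time, which is the characteristic look of the process.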
The first thing we created was the capability to ingest a wide range of video sources: web cameras, external cameras (via FireWire or Thunderbolt), QuickTime movie files, and Syphon streams. We designed it to cover a wide range of resolutions, from the classic Frame Buffer's 256 x 256 all the way to 1920 x 1080 HD. You can also control the frame rate of the video output, which can be sent to external devices, recorded to a QuickTime file, routed via Syphon to another application, or displayed full-screen for a performance. As for the process itself, we wanted it to be intuitive for the user, which makes for a difficult balance: fine-tuning and narrowing down parameter ranges without making the app so limiting that it acts like a filter you would apply in a nonlinear editing or compositing program.
There are seven parameters that can be explored in the app. Being concerned with performability, we made all of the parameters controllable via MIDI or OSC. A user can have an analog synth drive the parameters through a CV-to-MIDI interface, or use an OSC touch interface on their phone to control the app, making it possible to route video and control data between several software packages such as Ableton Live, VDMX, and Processing, all in real time.
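As a rough illustration of what the receiving side of that OSC control might look like, here is a sketch of a parameter registry keyed by OSC-style addresses. The addresses and parameter names are invented for the example; they are not the app's real ones.

```python
# Hypothetical parameter map: controls exposed at OSC-style addresses,
# each holding a normalized 0.0-1.0 value. Names are illustrative only.
PARAMS = {
    "/framebuffer/key_threshold": 0.5,
    "/framebuffer/buffer_depth": 0.25,
    "/framebuffer/frame_delay": 0.0,
}

def receive_osc(address, value):
    """Accept a value for a known parameter address, as an OSC client
    or a CV-to-MIDI-to-OSC bridge might deliver it, clamping to 0-1."""
    if address not in PARAMS:
        return False
    PARAMS[address] = max(0.0, min(1.0, float(value)))
    return True
```

Because every source, whether a phone, a sequencer, or a synth through a converter, ends up speaking the same address-and-value vocabulary, the app does not need to care where a control change came from.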
We released the Frame Buffer app in January 2016 as part of Signal Culture’s sustainability fundraising campaign. Signal Culture is a nonprofit experimental media art organization offering residencies, resources, and exhibition opportunities. The Frame Buffer is the first of six applications we have planned for 2016; check out the Signal Culture App Club for more details. The exciting part of making these tools is seeing what artists make with them. I want to share two works by artists who have used the Frame Buffer in their new work: “Negative Vibes/////Rough Idol” by Patrick James Cain, a sound and video artist residing in Washington, D.C., and “Mix Buffer” by Alan Powell, a video artist and Associate Professor at Arcadia University.
Jason and I are now finalizing our second app, titled Maelstrom, based on a process I developed for my 2012 project “Life in the Maelstrom”. Maelstrom combines real-time lumakeying and pixel sorting with digital feedback to create repetitions into infinity. The app allows the user to control the direction of the feedback, zoom in and out, and rotate the angle of each repetition in space. During the development of Maelstrom, I created a new performance titled “Synaptic Transmissions.” Working with the app in audiovisual performance led us to new ideas for future apps, in particular methods that would help create audiovisual sync. A simple example would be to use frame difference and average image brightness as MIDI or OSC output.
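That sync idea can be sketched in a few lines: analyze each incoming frame for motion (frame difference) and overall brightness, then scale the results to 7-bit MIDI values that can drive a synth or another app. As above, frames are simplified to flat lists of grayscale values; this illustrates the analysis, not any app's actual code.

```python
def brightness_average(frame):
    """Mean brightness of a grayscale frame (values 0.0-1.0)."""
    return sum(frame) / len(frame)

def frame_difference(frame_a, frame_b):
    """Mean absolute per-pixel change between two frames,
    a simple measure of how much motion occurred."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def to_midi(value):
    """Scale a normalized 0.0-1.0 value to a 7-bit MIDI value (0-127)."""
    return max(0, min(127, round(value * 127)))
```

Sending `to_midi(frame_difference(prev, cur))` out on every frame turns on-screen motion directly into a control signal an audio tool can respond to.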
We are not alone in developing creative tools for artists; there is an exciting renaissance of artists and toolmakers sharing and creating tools. Our goals are pedagogical in nature: to think about process versus effect. An effect is meant as an illusion; a real-time process, however, can be used to articulate new visual and aural metaphors that come out of discovery and a relationship between the artist and their tools. In many ways this connects real-time media production to music: it is not the inherent sound of the instrument that is significant, it is the choices the artist makes that create the melody.
By: Eric Souther, New Media Artist, Assistant Professor of New Media at Indiana University South Bend