We caught up with one of Syphon’s two creators, Tom Butterworth, to discuss the project.
SD Times: What got you started on Syphon?
Butterworth: Both Anton (Marini, co-author of Syphon) and I had been working on software for live performance for some years. The only way to mix the video output of two programs live was to use two computers and a hardware video mixer. Passing the output of one into another for further processing required even more hardware. This was expensive, and for most budgets it limited you to sub-HD resolutions. Latency was also an issue.
In 2009, Apple’s Mac OS X 10.6 (Snow Leopard) introduced the IOSurface API, which allowed programs to share resources on the graphics card. We saw the opportunity to use this for mixing and chaining live video, and Anton put together a proof of concept in a few days.
That’s when I really jumped in, and we spent several months improving performance and talking with other developers to produce a useful API, as well as building some plug-ins and implementations for the environments people were using (Quartz Composer, Max/MSP, FreeFrameGL). At that point, other people with access to our early betas were starting work on implementations for environments like Processing, openFrameworks and Unity.
How did you get interested in software development?
It’s been a bit of a hobby since I was a teenager; I started on HyperTalk in Apple’s long-forgotten HyperCard. I studied philosophy and literature at university, and later in life I was playing around with Super 8 film. Thinking about editing and cutting physical film led me into programming from a totally new angle, manipulating video, which has been the focus of all my work since.
What was your first computer?
Apple Mac Classic II.
When was the first time you used Syphon to create an installation? How did people react?
I do a lot more work programming than I do performing, so it’s only recently that I’ve used Syphon in earnest. I hope the fact I was using it was invisible to the audience. I used it to route video from a DSLR into the software driving the visuals for a dance performance.
What’s the most impressive installation or project you’ve seen Syphon used in?
I’m constantly surprised where it turns up—in arenas, television studios and architectural projections. I’m just as excited by the small things people are doing with Syphon, such as musicians connecting the software they already use to new tools as a way into video, and people writing small, highly specialized apps for installations and performances.
One hope when we released Syphon was that performers wouldn’t be locked into using a single monolithic app, and it’s rewarding to see the sheer number of tools people are mixing and customizing to create new workflows. For instance, Alejandro Crawford tours with the band MGMT performing visuals, and he’s been using Syphon in an elaborate setup to take video from a quadcopter above the crowd, pipe it through some custom effects built in Max, then into Unity, and back into the software he uses for projection.