If you’re reading this, odds are good you’ve seen “Star Trek: The Next Generation.” If so, you’re familiar with the Holodeck: a room where interactive 3D worlds can be generated for its users. We’re nowhere near projecting solid objects out of light, but we are getting remarkably close to being able to make a room look like just about anything we need.

The technology behind this is called projection mapping, and it’s often used in art installations and galleries to transform entire rooms into something otherwise impossible. The results are rooms covered in shifting mandalas and boxes that change their skin like an octopus.

Video: “IRMA / Save me” from SUPERBIEN on Vimeo.

The music video above was shot in one take with a four-wall projection-mapping setup. The effects are stunning on their own, and more so once you realize they were rendered in real time and could be performed live.

There are quite a few building blocks behind this type of installation, but the most fundamental is Syphon, an open-source project that lets applications share images with one another at high speed without dropping frames.

Syphon doesn’t do the projection mapping itself; it distributes the images being projected from the coordinating application to the projection-controller apps. It will eventually be able to stream full-frame video into other applications, and it can already be used with Unity for game development.
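For a sense of how small the developer-facing surface is, here is a rough Swift sketch of the publishing side, written against Syphon’s classic OpenGL-era API (SyphonServer and its publishFrameTexture call). The FramePublisher wrapper, the “Canvas” server name and the exact Swift bridging of the Objective-C method names are our own assumptions for illustration, not code from the Syphon project.

```swift
import Foundation
import OpenGL.GL
import Syphon   // assumes Syphon.framework is linked and exposed as a Swift module

// A minimal publisher sketch: each time the host app finishes rendering a
// frame into an OpenGL texture, it hands that texture to a SyphonServer,
// which makes it available to any client app on the same machine without
// copying the pixels back off the GPU.
final class FramePublisher {
    private let server: SyphonServer

    init(glContext: CGLContextObj) {
        // "Canvas" is an illustrative name; client apps browse the
        // available servers through SyphonServerDirectory.
        server = SyphonServer(name: "Canvas", context: glContext, options: nil)
    }

    func publish(texture: GLuint, width: Int, height: Int) {
        let region = NSRect(x: 0, y: 0, width: width, height: height)
        server.publishFrameTexture(texture,
                                   textureTarget: GLenum(GL_TEXTURE_RECTANGLE_EXT),
                                   imageRegion: region,
                                   textureDimensions: region.size,
                                   flipped: false)
    }

    deinit {
        // Tell clients the server has gone away.
        server.stop()
    }
}
```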

Rather than run through the lengthy list of projection-mapping projects and applications built on top of Syphon, we’ll simply show you this example: a cemetery turned into a cartoon.

We caught up with one of Syphon’s two creators, Tom Butterworth, to discuss the project.

SD Times: What got you started on Syphon?
Butterworth: Both Anton (Marini, co-author of Syphon) and I had been working on software for live performance for some years. The only way to mix the video output of two programs live was to use two computers and a hardware video mixer, and passing the output of one program into another for further processing required even more hardware. This was expensive, and for most budgets it was limited to sub-HD resolutions. Latency was also an issue.

In 2009 Apple’s Mac OS X 10.6 introduced the IOSurface API, which allowed programs to share resources on the graphics card. We saw the opportunity to use this for mixing and chaining live video, and Anton put together a proof of concept in a few days.
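To make that concrete, here is a rough Swift sketch of the primitive Syphon builds on: one process creates an IOSurface (a GPU-backed pixel buffer), wraps it in a mach port, and hands that port to another process, which looks the same surface up and can bind it as a texture with no copy through main memory. The dimensions, the single-file layout and the unspecified IPC channel are assumptions for the example; this is not Syphon’s actual code.

```swift
import Foundation
import IOSurface

// Process A: create a GPU-backed pixel buffer that other processes can map.
let properties: [String: Any] = [
    kIOSurfaceWidth as String: 1920,
    kIOSurfaceHeight as String: 1080,
    kIOSurfaceBytesPerElement as String: 4   // e.g. 8-bit BGRA
]
guard let surface = IOSurfaceCreate(properties as CFDictionary) else {
    fatalError("IOSurfaceCreate failed")
}

// Wrap the surface in a mach port that can be handed to another process
// over whatever IPC channel the two apps share.
let port: mach_port_t = IOSurfaceCreateMachPort(surface)

// Process B, after receiving `port`: look the surface up and use it,
// e.g. by binding it to an OpenGL or Metal texture for rendering.
if let shared = IOSurfaceLookupFromMachPort(port) {
    print("Sharing a \(IOSurfaceGetWidth(shared))x\(IOSurfaceGetHeight(shared)) surface")
}
```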

That’s when I really jumped in, and we spent several months improving performance and talking with other developers to produce a useful API, as well as building plug-ins and implementations for the environments people were using (Quartz Composer, Max/MSP, FreeFrameGL). By that point, other people with access to our early betas had started on implementations for environments like Processing, openFrameworks and Unity.

How did you get interested in software development?
It’s been a bit of a hobby since I was a teenager; I started on HyperTalk in Apple’s long-forgotten HyperCard. I studied philosophy and literature at university, and later in life I was playing around with Super 8 film. Thinking about editing and cutting physical film led me into programming from a totally new angle, affecting video, which has been the focus of all my work since.

What was your first computer?
Apple Mac Classic II.

When was the first time you used Syphon to create an installation? How did people react?
I do a lot more programming than performing, so it’s only recently that I’ve used Syphon in earnest. I hope the fact that I was using it was invisible to the audience: I used it to pipe video from a DSLR into the software I was running as part of a dance performance.

What’s the most impressive installation or project you’ve seen Syphon used in?
I’m constantly surprised where it turns up: in arenas, television studios and architectural projections. I’m just as excited by the small things people are doing with Syphon, such as musicians connecting the software they already use to new tools so they can begin to explore video, and people writing small, highly specialized apps for installations and performances.

One hope when we released Syphon was that performers would no longer be locked into a single monolithic app, and it’s rewarding to see the sheer number of tools people are mixing and customizing to create new workflows. For instance, Alejandro Crawford tours with the band MGMT performing visuals, and he’s been using Syphon in an elaborate setup to take video from a quadcopter above the crowd, pipe it through custom effects built in Max, then into Unity, and back into the software he uses for projection.

What’s the weirdest thing Syphon’s been used for?
When you write software for performers, you become blind to weird.

What’s next?
We’ve joined up apps on the same machine; we’d love to make it just as easy and cheap to connect video between machines.