“Star Trek.” “2001.” “Minority Report.” This is just a short list of movies and TV shows that have created indelible cultural images of computer technology. Today, that list is growing at what seems like an exponential rate, as our obsession with computers, tablets, smartphones and software surges. Diving into what all this on-screen code says about our relationship to software is an anthropologist’s dream.
On Jan. 2, 2014, London-based computer scientist and scientific history buff John Graham-Cumming tweeted a shot of code from the 2013 movie “Elysium” next to its mundane origins. The plot centers on a dystopia in the year 2154, so it’s ironic that the code used to reboot the space station is in fact assembly language from the Intel Architecture Software Developer’s Manual, Volume 3. Come to think of it, will systems still require a reboot in 2154? Will “Intel inside” still be a meaningful advertising slogan? Will anyone still program in assembly?
Codespotting takes off
After his tweet went viral, Graham-Cumming, author of “The Geek Atlas” and the man behind a successful petition requesting the British government’s apology for its treatment of Alan Turing, started Source Code in TV and Films. The blog contains images of code he and others submitted, along with succinct explanations of their origins.
Some are funny, such as the shots of eBay’s website CSS code used to illustrate “very dangerous hacker activities” on a television show. Similarly, in the 2005 movie “Stealth,” robotic fighter planes (that would perhaps be called drones today) rapidly become sentient. An image of the planes’ artificial intelligence source code turns out to be raw TeX/LaTeX markup for nonsensical mathematics equations.
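For readers curious what such raw markup looks like, here is a hypothetical fragment in the same spirit (invented for illustration, not taken from the film): a mathematically meaningless equation that would nonetheless look impressive scrolling across a fighter plane’s screen.

```latex
% Nonsense "AI" math: the symbols typeset beautifully but mean nothing.
\begin{equation}
  \Psi_{\mathrm{AI}} = \sum_{k=0}^{\infty}
    \frac{\nabla \cdot \mathbf{F}_k}{\lambda^{k!}}
    \otimes \int_{0}^{\pi} e^{i\theta^{3}}\, d\theta
\end{equation}
```

On screen, of course, audiences never see the rendered equation at all, only the raw markup, which is presumably why it reads as “code.”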
All this software speaks to a change in perception, Graham-Cumming believed. “I think programming has become a bit cooler than it used to be. Using computers or writing code is used as a core part of the plot. You see people configuring things or writing custom code, and programming is seen as an interesting activity.”

Smoke and mirrors
Not all code seen on TV can be forensically traced to some innocuous command-line screen or markup text, however. Increasingly, software depicted in movies is custom-built by a Chicago-based design team.
One of only two main U.S. producers of fake movie and TV GUIs, Twisted Media is a creative collective founded by visual effects designer Derek Frederickson, who parlayed a background in interactive CD-ROMs in the 1990s and guided multimedia presentations into an accidental—and highly successful—career in creating multimedia for TV shows.
It all started with an e-mail in 2007. “It’s just blind luck. What got me launched is ‘Leverage,’ ” he said. “The pilot of that show was shot here in Chicago. I got a random e-mail from the visual effects supervisor’s assistant saying, ‘Can you do graphics?’ The executive producer was Dean Devlin, who wrote ‘Independence Day’ and ‘Stargate.’ He’s a hero of mine. I ended up having the interview and got the gig. I worked on ‘Torchwood’ because the producer was on season two of ‘Leverage.’ Now I’m working on three prime-time shows.”
The spy gadgetry and high-tech interfaces Frederickson’s become known for weren’t done as post-production green-screen replacements. Rather, he created on-set computer graphics that actors could interact with live in the filming of 77 episodes of TNT’s “Leverage.”
All of the interactive pieces are Adobe Flash, with plenty of work done in Photoshop, Illustrator, Cinema 4D, Maya, 3ds Max and After Effects for various graphic, animation and 3D modeling tasks. But the task of coming up with a compelling design falls squarely on Frederickson’s team.
“Depending on the director, we may get a lot of input or next to none—and with most of them, it’s none,” he said. “ ‘Make it awesome, make it cool,’ and if it’s not quite cool enough we make it again.
“One of the characters that I worked with for years on Leverage was Hardison, the hacker. Sometimes the writers would simply write, ‘Hardison hacks.’ We got to a point in the show where whatever he does, we would just make it up.”
Unless they’ve been cleared for commercial use, the interfaces pictured in TV shows and movies must be original. “I make sure I have rights, or I make it from scratch,” said Frederickson. “Hardison always had his own OS.”
The square that appears on the home button of an iPhone must be greeked out, while all the smartphone’s icons look remarkably realistic but are original. The once-famous Microsoft “Blue Screen of Death” can’t be used, so something similar has to be created. Another facsimile was a Words With Friends-style game played over the course of several episodes. According to Frederickson, “I had to design the game for phones, iPads and computers, and pre-program it so it would finish the way the script said.”

The OS of the future
Part of the fun of designing movie GUIs is the opportunity to invent something. “Futuristic stuff is always fun, because you try and think of what isn’t right now and try to imagine what’s better, cooler, or even going a little retro,” said Frederickson, who had to do just that for the movie “Divergent,” which is set 150 years in the future in a post-apocalyptic dystopia.
“I worked with Andy Nicholson, who was the production designer. He also did [the movie] ‘Gravity,’ ” said Frederickson. “It was a lot of fun to come in totally fresh to the project as a designer. There weren’t that many scenes with technology in the movie, but I was designing the OS of the future for that world. Future screens are square, that’s what I take away from it,” he said with a laugh. He also noted that the screens of the future, at least in current movies, tend to be blue.
Luckily, none of the technology actually has to work. One project involved finding something colorful and visually arresting to represent an app that a bad guy used to change his voice. Another movie will portray actors interacting with holograms, which will primarily be done in post-production.
In a current sci-fi project, Frederickson is working with a scientist: “They hired a theoretical physicist to help with the actual equations. The e-mails I have from this woman are just outstanding. She’s incredibly smart and detail-oriented. It’s going to be hilarious because at a certain point I’m just going to have to say, ‘You do realize that it’s my job to just make it look pretty.’ I’ll take all those real equations and turn them into something that fits into the movie.”
Blinking red and bigger than life
So what do the movies get wrong these days about coding, aside from simply throwing up random and unrelated source code snippets?
“In general, there’s the perception that you can do almost anything,” said Graham-Cumming. “Like on ‘CSI,’ you see them infinitely zoom and enhance from one pixel. There’s an overreach from the powerful data processing we have now.”
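Graham-Cumming’s point is easy to demonstrate: once detail has been averaged away into a single pixel, no amount of “enhancing” can bring it back. A minimal sketch in Python (using plain lists of brightness values as a stand-in for image data; all names here are illustrative, not from any real imaging library):

```python
def downscale_to_one_pixel(image):
    """Average every pixel into one value; the original detail is destroyed."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def enhance(pixel, width, height):
    """TV-style 'zoom and enhance': upscaling can only repeat what is there."""
    return [[pixel] * width for _ in range(height)]

# Two different 2x2 "images" with opposite visible structure...
image_a = [[0, 255], [255, 0]]   # checkerboard
image_b = [[255, 0], [0, 255]]   # inverted checkerboard

# ...collapse to the identical single pixel, so no enhancement,
# however dramatic, could ever tell them apart again.
pixel_a = downscale_to_one_pixel(image_a)
pixel_b = downscale_to_one_pixel(image_b)
print(pixel_a == pixel_b)      # True
print(enhance(pixel_a, 2, 2))  # [[127.5, 127.5], [127.5, 127.5]]
```

Real image sensors and compression are more forgiving than this toy case, but the principle stands: “enhance” can interpolate, never recover.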
At the same time, it’s also true that we’ve rapidly become accustomed to the power of smartphones: “The technology behind the map on the GPS in your pocket is actually incredible,” said Graham-Cumming. “It’s built on a constellation of satellites around the planet, but it’s become just run of the mill that you can find a Starbucks with one-meter accuracy in the middle of the city.”
And we’re all used to seeing messages that blare the obvious on TV. “Most audiences need the big red blinking thing that says ‘Bomb defused,’ ” said Frederickson. “I’m working on a show right now called ‘Crisis,’ and I’m trying to be true to what happens in real-life programming. The comms gunwoman [a bad guy] is a hacker. She would be typing things into a command-line, and I started trying to be more realistic, where it wasn’t 48-point blinking font. The camera guy was just like, ‘Dude what are you doing?’ Shows aren’t about graphics or about code, they’re about the story. If there’s a story point that has to be made and there’s just a fraction of a second to focus in on it, I have to make whatever it is bigger.”
If TV is to be believed, there’s also been a command-center spending-spree in America, with every law enforcement unit tricked out with floor-to-ceiling touch-screens. “I think when production designers are building the sets, it’s an easy way to make a room interesting and give a lot of life to it,” said Frederickson. He noted, however, that while he’s visited several fantastic real-world command centers for police or firefighting, the GUIs themselves tend to be bland: “Everything looks like Microsoft Office—not as interesting as it could be.”

What do movie computers say about us?
Not surprisingly, there’s been an evolution in the Hollywood portrayal of computers and geeks. “In the early days, especially the 1970s, computers were evil, like in the 1973 Michael Crichton movie ‘Westworld.’ That has declined as computers go everywhere,” said Graham-Cumming. Computers that pass the Turing Test are a common theme too, which is a separate question from good or evil. “You could argue that in the movie ‘2001,’ HAL wasn’t really evil, just ensuring the success of the mission,” he said.
A new TV show, “Intelligence,” features a government agent who has been enhanced by putting a chip in his brain. His superpower? Always being connected to the Internet. Does that speak to our current obsession with the Web?
“We’re certainly addicted to finding out the answer to things on the Internet,” said Graham-Cumming, who announced a reduction in his own online activities a few years ago. “I fear now you can never ask someone a question, you can just look on Google and the answer is instantaneous.”
As smartphones proliferate, TV shows ranging from “Sherlock” to “The Mindy Project” have played with adding a meta-layer to the action by superimposing text messages or other graphics on top of the live action rather than embedding them in it. When done well, it can enhance the story, but it can also distract from the plot, just as the sound of an iPhone text message emanating from the TV is sure to cause many viewers to do a Pavlovian double-take at their own phone.
The social media-driven meta-analysis of TV is also exploding. “It’s an anthropological-type thing,” said Graham-Cumming. “Someone has a Tumblr of every computer used in the series ‘Law and Order,’ which has been running for many years. There are details on every computer and what it was used for. It’s pretty interesting.”
And audiences are eating it up. Graham-Cumming recently received fan mail from a doctor who loved Source Code in TV and Films, comparing it to the medical errors shown on TV. Similar to past efforts by academics to debunk the physics portrayed in movies ranging from “Star Wars” to “Mission Impossible,” the new attention to movie code may also be an opportunity to build historical understanding of computer science, one of his prime motivations for writing about scientific history and honoring Alan Turing.
“I’d like to write a book about how computers actually work, because people are completely mystified by them, but the truth of it is it’s very simple,” said Graham-Cumming. “A common misconception is you have to be a mathematician. Another one is that it involves mind-numbing amounts of detail.”
In highlighting movie code, he’s made one thing very clear: “It’s certainly true that software is absolutely everywhere. The general public hasn’t cottoned on to how much software is running the world, from determining your washing machine cycle to when your airbag deploys.”