To see just how much interface paradigms have changed, grab the nearest 8-year-old and see how they react to a computer, tablet or video game. Chances are they’ll jab at the screen expecting touch controls, or pick up a controller and swing it around like a Wii Remote. Take a picture with a film camera, and the 8-year-old will demand to see the image on the little screen they expect to find on the back of it.
Add to this mix of new interfaces the Microsoft Kinect, and it shouldn’t be long before youngsters everywhere are waving madly at computer screens, expecting them to recognize the movements. If Intel and Creative Labs have their way, however, computers should soon be able to understand such arm-waving.
Barry Solomon, a strategist for Intel’s perceptual computing effort, is working hard to make computers understand what they see through their webcams. “It’s about moving beyond a keyboard, mouse and touch, and providing new means for users to interact with their computing devices,” he said. “The whole point is to add senses to the computer’s brain so it can receive more info from the users. It can read information like where the user is looking, what the user is saying, [and] where the user’s hands are to create a new interactivity paradigm.”
The Perceptual Computing SDK, created by Intel and Creative Labs, includes what looks like a standard USB webcam and, for now, is intended to work only with laptops and desktops. Under the covers, however, there’s technology similar to Kinect’s that allows the camera to measure depth and recognize objects that are moving in front of others. That means a hand waving in front of a busy background can still be picked out and recognized. Perhaps that waving hand pushes a Web page aside, or scrolls a list on-screen.
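In the beta SDK’s C++ samples, that kind of gesture loop was driven through a helper class called UtilPipeline. The sketch below shows roughly what detecting a swipe or a wave looked like; the class, method and label names are drawn from the beta-era sample code and should be treated as assumptions, since the API was still shifting from release to release, and the application hooks in the comments are purely hypothetical.

```cpp
// Minimal gesture loop, modeled on the Perceptual Computing SDK beta samples.
#include "util_pipeline.h"

int main() {
    UtilPipeline pipeline;
    pipeline.EnableGesture();          // turn on the gesture-recognition module
    if (!pipeline.Init()) return 1;    // camera missing or initialization failed

    // AcquireFrame(true) blocks until the camera has new data.
    while (pipeline.AcquireFrame(true)) {
        PXCGesture *gesture = pipeline.QueryGesture();
        PXCGesture::Gesture data = {0};
        // Ask for the most recent gesture performed by the primary hand.
        if (gesture->QueryGestureData(0,
                PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                0, &data) >= PXC_STATUS_NO_ERROR && data.active) {
            switch (data.label) {
            case PXCGesture::Gesture::LABEL_NAV_SWIPE_LEFT:
                // hypothetical app hook: push the current Web page aside
                break;
            case PXCGesture::Gesture::LABEL_HAND_WAVE:
                // hypothetical app hook: scroll the on-screen list
                break;
            default:
                break;
            }
        }
        pipeline.ReleaseFrame();
    }
    pipeline.Close();
    return 0;
}
```

In those same samples, the one pipeline object could also enable the face-tracking and voice modules alongside gestures, which is how the “added senses” Solomon describes would all feed a single application.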
There is one major difference between how the Kinect is used and how Perceptual works, and it mirrors the difference between console and PC gaming: The user of a Kinect stands across the room, while the user of a perceptual computing device sits in front of the computer, within three feet of the screen.
“Midair gesture was of strong interest as a means of interacting, for developers,” said Solomon. “What we’ve been focused on is creating that close-range experience for users. Within three feet of their computing device, they could make use of this.”
Solomon said that developers are of key importance to making perceptual computing a pervasive interface medium. “For developers, we’re in the beta phase. We launched the SDK last fall at Intel Developer Forum,” he said. “Anyone can order a camera (from intel.com/software/perceptual). We’re at the point now where what we’re really trying to do is encourage innovation, and inspire and cultivate an ecosystem so developers will adapt these new interactivity modes into their products and dream up things we haven’t even thought of.”