SAN FRANCISCO — Android OS version 4.1, also known as Jelly Bean, will ship mid-July, Google announced at its I/O Conference this week. This new version of the operating system includes a ground-up reworking of the graphics rendering system, and brings triple buffering and vsync to bear on all OS animations.

Codenamed Project Butter, the effort to smooth out animations in Jelly Bean was demonstrated with a high-speed camera. Dave Burke, the Google engineer charged with directing Project Butter, said that animations are now rendered at 60 frames per second across all applications, with no new code required from third-party application developers.
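To make the change concrete, here is a minimal sketch, not code from Google's demonstration, of how an application might step its own animation off Jelly Bean's new vsync signal using the Choreographer class added in Android 4.1; the class name and per-frame update are illustrative.

    import android.view.Choreographer;
    import android.view.View;

    // Illustrative sketch: drive a custom animation from the vsync-aligned
    // Choreographer callback instead of a timer, so each update lands on a
    // display refresh (roughly 60 times per second on typical hardware).
    public final class VsyncDrivenAnimation implements Choreographer.FrameCallback {
        private final View view; // the view to redraw each frame

        public VsyncDrivenAnimation(View view) {
            this.view = view;
        }

        public void start() {
            // doFrame() will be invoked on the next vsync pulse.
            Choreographer.getInstance().postFrameCallback(this);
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            // Advance animation state using the frame timestamp, then redraw.
            view.invalidate();
            // Re-register to keep stepping once per display refresh.
            Choreographer.getInstance().postFrameCallback(this);
        }
    }

Ordinary framework animations pick up the vsync-aligned timing automatically; a callback like the one above is only needed by applications that drive their own animations.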

Those independent developers will also be pleased with another major change in Jelly Bean: Paid applications are now encrypted with a device-specific key. This should help cut down on the rampant piracy on the Android platform. (A quick search of the popular file-sharing site The Pirate Bay yielded dozens of Android app collections for download, some torrents containing well over 3,000 Android applications.)
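As a rough idea of what a device-specific key buys, here is a conceptual sketch in Java, emphatically not Google's actual scheme: a package encrypted under one device's key is unreadable with any other device's key, so a copied file is useless elsewhere. The class name, the AES/CBC choice, and the 16-byte key and IV are all assumptions made for illustration.

    import java.security.GeneralSecurityException;
    import javax.crypto.Cipher;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;

    // Conceptual illustration only -- not Google's implementation. Encrypting
    // the same application bytes under different per-device keys yields
    // ciphertext that only the original device can decrypt.
    public final class DeviceKeyedPackage {
        // deviceKey must be 16 bytes for AES-128 here; iv must be 16 bytes.
        public static byte[] encryptForDevice(byte[] packageBytes, byte[] deviceKey, byte[] iv)
                throws GeneralSecurityException {
            Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE,
                    new SecretKeySpec(deviceKey, "AES"),
                    new IvParameterSpec(iv));
            return cipher.doFinal(packageBytes); // ciphertext tied to this device's key
        }
    }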

Hugo Barra, director of the Android platform at Google, said that signed encryption will help ensure that developers earn money from their work. He also reintroduced the revamped Google Play store, formerly known as the Android Market. The new store resembles the interface of Amazon’s Kindle Fire tablet, presenting movies, books and music for purchase in large tiles, a flashier style than the typical app store’s.

For developers looking to optimize their Android applications, Jelly Bean includes a new tool known as Systrace. This performance profiler offers a dynamic graph of activity across the Android operating system. In a demonstration, Barra showed a profile in which the database was the bottleneck: Systrace showed a gap in all of the performance graphs whenever the database was used, making it obvious where the problem lay.
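As a hedged aside for developers: Systrace itself visualizes system-level activity, and in Android releases after Jelly Bean (API level 18 and later) the android.os.Trace class lets applications add their own named slices to a capture, which makes a bottleneck like the database in Barra's demo easy to spot. The helper class, slice name and query arguments below are hypothetical.

    import android.database.Cursor;
    import android.database.sqlite.SQLiteDatabase;
    import android.os.Trace;

    // Hypothetical helper: wrap a database query in a named trace slice so it
    // shows up as a labeled block on the Systrace timeline. android.os.Trace
    // requires API level 18 or later.
    public final class TracedQueries {
        public static Cursor tracedQuery(SQLiteDatabase db, String sql, String[] args) {
            Trace.beginSection("app:dbQuery"); // slice name shown in the trace
            try {
                return db.rawQuery(sql, args); // the suspected bottleneck
            } finally {
                Trace.endSection();            // always close the slice
            }
        }
    }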

For hardware manufacturers and designers, Android now offers a Platform Development Kit. The PDK includes documentation and source code for the low-level APIs in Android, and will help hardware designers build their devices from the ground up to support Android. The PDK will be made available two to three months before every new Android OS release, starting with Jelly Bean, whose PDK is available today.

Accessibility is a major theme in Jelly Bean as well. The Android team has decoupled speech-to-text recognition from the Internet, meaning users can now talk to their phones even when offline. This has further implications for blind users, who are now addressed with a combination of this speech recognition, new gesture-based interface controls, and added support for braille devices.
Search in the spotlight
Google Now is the newest search capability on Android. With a user’s search history, location data and calendar info, Google Now can tailor search results for that user. Given a calendar item referring to specific flight information, Google Now returns results that may include the arrival time for that flight, public transit information for getting to the airport, and any other pertinent information it can find.

Google’s other product lines were also updated at Google I/O. The company pushed hard to expand the capabilities of its social network, Google+. On the second day of the show, the company unveiled versions of its Chrome browser for Android under Jelly Bean, and for both iPhone and iPad.

For cloud users, Google App Engine has been a popular way to host applications on Google’s hardware. At Google I/O, Urs Hölzle, fellow and senior vice president of technical infrastructure at Google, introduced the Google Compute Engine, which allows users to run virtual machines inside of Google’s infrastructure.

“In 2008, we launched App Engine,” said Hölzle. “It lets you write simple, intuitive code to build your applications, then we take that and scale it. Today, it’s supporting over a million active applications. App Engine receives 7.5 billion hits per day, and performs 2 trillion data store operations per month. We’re very proud of what App Engine has enabled developers to do. You’ve told us you want VMs on-demand, with industry-leading performance and scalability.”

Unlike App Engine, which restricts users to specific languages and storage methods, Google Compute Engine is a standard public cloud offering that can host virtual machines. Hölzle showed a demonstration of the system with a genome mutation connection-detection application. When run on a 1,000-core cluster, he said, the application took 10 minutes to find each connection. On 10,000 cores in Google Compute Engine, finding each connection took seconds. He then cranked the application up to 600,000 cores, and watched as connections were discovered in milliseconds.

Hölzle said Google Compute Engine is available in a preview form today. He said the pricing scheme for it would be lower than industry standards, but did not detail pricing any further.

The company also announced the availability of the Nexus Q, the first consumer hardware device designed and built by Google itself. The spherical device plugs into televisions and stereos to bring content purchased from the Google Play store to home media systems. The Nexus Q ships in mid-July and is manufactured in the United States.

Google Drive was updated as well: the online storage service now has clients for Android and iOS, and Chrome OS was added to its list of supported platforms.
Glass’ big splash
But it was the first-day keynote demonstration of Google Glass that was, without a doubt, the highlight of the conference. Glass is an experimental eyeglass-based display that records video and audio, and presents information through an eyepiece mounted just above eye-level.

For the keynote, Google cofounder Sergey Brin appeared on stage and asked the crowd if they would like to see Glass demonstrated. He then informed the audience that he had lent his glasses to someone. That someone happened to be a skydiver in a zeppelin above the conference center. Said skydiver and four friends then parachuted to the roof of the Moscone Center, rappelled down the side of the building, and brought the glasses to Brin on stage. The keynote projector screens streamed live video from the skydivers as they plummeted, right up until they arrived on stage.

Brin then detailed what applications he envisions for the device. He explained that Glass is designed to augment reality, not get in its way. He envisioned applications that used facial recognition to tell you information about the person you’re speaking to, or that directed users to the nearest restaurants.

Additionally, Glass is always recording video and audio, so when something noteworthy happens, wearers simply tap a button and the last few minutes of video are saved. A wearer need never miss the chance to film something exciting, even if they weren’t ready to film when it happened.

Chromebooks weren’t left out of the Google love fest. Chrome OS is continuing to evolve, and Google announced at the show that it would begin retailing Chromebooks at Best Buy.