Google is looking to improve virtual reality experiences with new experiments and SDKs. The company announced that it is open sourcing Resonance Audio, the spatial audio SDK it released last year. Google is also sharing new insights into its experiments with light fields, a set of advanced capture, stitching, and rendering techniques.
“To accelerate adoption of immersive audio technology and strengthen the developer community around it, we’re opening Resonance Audio to a community-driven development model. By creating an open source spatial audio project optimized for mobile and desktop computing, any platform or software development tool provider can easily integrate with Resonance Audio. More cross-platform and tooling support means more distribution opportunities for content creators, without the worry of investing in costly porting projects,” Eric Mauskopf, product manager at Google, wrote in a post.
According to the company, spatial audio is essential for providing a sense of presence within virtual reality and augmented reality worlds.
The open source project will include a reference implementation of YouTube’s Ambisonic-based spatial audio decoder, which is compatible with the Ambisonic format used across the industry. It will also feature encoding, sound field manipulation, and decoding techniques, as well as head-related transfer functions (HRTFs) for achieving rich spatial audio. Additionally, Google will open source its library of optimized DSP classes and functions.
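Resonance Audio’s decoder itself is far more sophisticated, but the basic idea behind Ambisonic binaural rendering can be illustrated with a short sketch: decode the B-format channels to a small set of virtual speakers, then convolve each feed with the head-related impulse response (HRIR) measured for that speaker’s direction and sum the results into a left/right pair. The speaker layout, gain convention, and function names below are illustrative assumptions, not the library’s actual API.

```python
import numpy as np
from scipy.signal import fftconvolve

# Hypothetical horizontal ring of virtual speakers (azimuths in degrees).
SPEAKER_AZIMUTHS = np.radians([45.0, 135.0, 225.0, 315.0])

def decode_foa_to_speakers(w, x, y, azimuths=SPEAKER_AZIMUTHS):
    """Very basic horizontal-only first-order Ambisonic decode: project the
    B-format channels (w, x, y as 1-D sample arrays) onto each virtual-speaker
    direction. Gain conventions differ between Ambisonic flavors (FuMa vs.
    AmbiX); this simple sampling decoder is for illustration only."""
    return [(w + np.cos(az) * x + np.sin(az) * y) / len(azimuths)
            for az in azimuths]

def binauralize(speaker_feeds, hrirs_left, hrirs_right):
    """Convolve each virtual-speaker feed with the HRIR pair measured for that
    direction (hrirs_left/right are lists of impulse responses, one per
    speaker) and sum everything into a stereo (left, right) output."""
    left = sum(fftconvolve(feed, h) for feed, h in zip(speaker_feeds, hrirs_left))
    right = sum(fftconvolve(feed, h) for feed, h in zip(speaker_feeds, hrirs_right))
    return left, right
```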
In addition, Resonance Audio is being open sourced as a standalone library, along with associated engine plugins, a VST plugin, tutorials, and examples.
Google says that since its November launch, Resonance Audio has been used in many applications, such as Pixar’s Coco VR for Gear VR and Disney’s Star Wars: Jedi Challenges app for Android and iOS.
Another way Google has been trying to create a sense of presence in VR is through its experiments with light fields. According to the company, light fields create a sense of presence by reproducing motion parallax and realistic textures and lighting.
“With light fields, nearby objects seem near to you—as you move your head, they appear to shift a lot. Far-away objects shift less and light reflects off objects differently, so you get a strong cue that you’re in a 3D space. And when viewed through a VR headset that supports positional tracking, light fields can enable some truly amazing VR experiences based on footage captured in the real world,” the team wrote in a post.
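The parallax cue the team describes comes down to simple geometry: when your head translates sideways by a small distance, an object at distance r appears to shift by roughly arctan(d/r), so nearby objects sweep through a much larger angle than distant ones. The sketch below (with arbitrary example numbers, not anything from Google’s capture rig) makes that falloff concrete.

```python
import math

def parallax_shift_deg(object_distance_m, head_translation_m):
    """Approximate angular shift (in degrees) of a point at the given distance
    when the viewpoint translates sideways by head_translation_m."""
    return math.degrees(math.atan2(head_translation_m, object_distance_m))

# Illustrative numbers: a 10 cm sideways head movement.
for distance in (0.5, 2.0, 10.0, 100.0):
    shift = parallax_shift_deg(distance, 0.10)
    print(f"{distance:>6.1f} m away -> {shift:5.2f} deg apparent shift")
```

For a 10 cm head movement, an object half a meter away shifts by about 11 degrees, while one 100 m away shifts by well under a tenth of a degree, which is why distant scenery appears nearly fixed as you move.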
As part of these experiments, Google is releasing an app on Steam VR called “Welcome to Light Fields” to show the potential of the technology.