Cannes Lions
GOOGLE CREATIVE LAB, New York / GOOGLE / 2018
Description
Since 2012, the Curiosity rover has been exploring and photographing Mars, searching for signs that it was ever suitable for life. Scientists at NASA’s Jet Propulsion Lab use these photos to create a 3D model of Mars. It’s a one-of-a-kind scientific tool for planning future missions.
We put that same 3D model into an immersive experience for everyone to explore. We call it Access Mars, and it lets users see what the scientists see. Check out Curiosity’s landing site and other mission sites like Pahrump Hills, Murray Buttes, and Marias Pass, plus a near-live look at where the rover is right now. JPL scientist Katie Stack Morgan explains key points about the rover, the mission, and some of the early findings along the way.
It's all built using WebVR, a technology that lets you see virtual reality right in the browser, across many devices, without installing any apps.
Execution
Access Mars is a WebVR experiment accessible on latest-generation devices with a Chrome browser: 360° view on desktop, mobile magic window, Cardboard, Gear VR, Daydream, Oculus, and Vive.
We placed the 2D orientation film within the 3D environment. The 2D film then seamlessly transitions into the fully immersive 3D environment, rendered live with JavaScript, so a user only has to click Enter once before entering VR.
All design elements were carefully considered for maximum legibility in the virtual environment. The location-specific POIs were inspired by key geological markers at each location. The POI content was developed in collaboration with, and voiced by, JPL mission scientist Katie Stack Morgan.
We built a custom three.js render pipeline that allows real-time JPEG texture decoding and loading (ProgressiveTexture) without impacting the framerate. This lets us load low-resolution textures initially, then stream in high-resolution textures in the background as the user explores.
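The two-pass idea can be sketched in plain JavaScript. This is a minimal illustration, not the actual ProgressiveTexture pipeline: `loadTexture` stands in for whatever async decode step is used (for example `createImageBitmap` in a worker), and all names here are hypothetical.

```javascript
// Sketch of progressive texture loading: show a low-res texture immediately,
// then swap in the high-res version once it has decoded in the background.
// Decoding off the main render loop is what keeps the framerate steady.
async function loadProgressive(material, lowResUrl, highResUrl, loadTexture) {
  // Low-res first: small download, fast decode, scene becomes interactive quickly.
  material.map = await loadTexture(lowResUrl);
  // High-res in the background; swap it onto the material whenever it is ready.
  loadTexture(highResUrl).then((texture) => { material.map = texture; });
}
```

The key design point is that the high-res load is fire-and-forget: the user never waits on it, and the worst case is simply seeing the low-res texture a little longer.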
To allow for real-time interaction, we created a simplified terrain mesh specifically for collision testing. We took the high-polygon terrain mesh and generated an invisible low-polygon copy, which lets us test for collisions much faster.
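The payoff of the low-polygon proxy is easy to see in code: a raycast costs roughly one ray-triangle test per triangle, so halving the triangle count halves the work. Below is a standard Möller–Trumbore intersection test as an illustrative sketch (plain `[x, y, z]` arrays, not the actual Access Mars collision code).

```javascript
// Ray-triangle intersection (Möller–Trumbore). Returns the distance t along
// the ray to the hit point, or null if the ray misses the triangle.
function rayIntersectsTriangle(origin, dir, a, b, c) {
  const EPS = 1e-8;
  const sub = (p, q) => [p[0] - q[0], p[1] - q[1], p[2] - q[2]];
  const dot = (p, q) => p[0] * q[0] + p[1] * q[1] + p[2] * q[2];
  const cross = (p, q) => [
    p[1] * q[2] - p[2] * q[1],
    p[2] * q[0] - p[0] * q[2],
    p[0] * q[1] - p[1] * q[0],
  ];

  const e1 = sub(b, a);
  const e2 = sub(c, a);
  const h = cross(dir, e2);
  const det = dot(e1, h);
  if (Math.abs(det) < EPS) return null; // ray parallel to triangle plane

  const invDet = 1 / det;
  const s = sub(origin, a);
  const u = invDet * dot(s, h);
  if (u < 0 || u > 1) return null; // outside the triangle

  const q = cross(s, e1);
  const v = invDet * dot(dir, q);
  if (v < 0 || u + v > 1) return null; // outside the triangle

  const t = invDet * dot(e2, q);
  return t > EPS ? t : null; // hit only if in front of the ray origin
}
```

Running this per triangle over the invisible low-polygon mesh gives the same ground height the user perceives, at a fraction of the cost of testing the full-detail terrain.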
We simplified the shaders to ensure the best performance across devices with widely varying processing power. The lighting on the rover is simulated with a simple Lambertian shader rather than the built-in three.js lights and materials. The lighting on the terrain is baked into the textures by JPL as part of their photo-asset preprocessing, which we also helped develop.
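The Lambertian model mentioned above reduces lighting to a single dot product: brightness depends only on the angle between the surface normal and the light direction, intensity = max(N · L, 0). The actual shader would be GLSL; here is the same math as a runnable JavaScript sketch with unit-length `[x, y, z]` vectors.

```javascript
// Lambertian (diffuse-only) shading: one dot product and a clamp per sample,
// which is why it is far cheaper than a full light/material system.
function lambert(normal, lightDir, albedo) {
  const nDotL = normal[0] * lightDir[0]
              + normal[1] * lightDir[1]
              + normal[2] * lightDir[2];
  const diffuse = Math.max(nDotL, 0); // surfaces facing away from the light stay dark
  return albedo.map((channel) => channel * diffuse);
}
```

For a surface facing the light head-on the albedo passes through unchanged; a surface facing away returns black, with a smooth cosine falloff in between.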