Bubbles I/O 2017
Visit g.co/bubbles to blow bubbles with people from around the world
Following up on last year’s Paper Planes, we worked with Google to create another interactive experience that brings I/O attendees together with fans watching the keynote around the world in a moment of collective discovery and gameplay.
Using the microphone or touch on their device, users blow a bubble and send it out into the world. Bubbles from other users float across the screen tagged with data showing when and where they were created.
While the 7,000 attendees are creating bubbles, visuals on the big screen show other bubbles pouring in from around the world. Stats display the cities where bubbles are being made and the number of users connected, and, using the newly updated Google Earth, we zoom in on 3D cityscapes.
Switching to landscape puts the user in the middle of a flowing field of bubbles that can be experienced in 360 degrees using the device gyroscope. Users on Android can enable their camera so the bubbles appear to fill their environment. At the end of the keynote experience we flip to a camera in the stadium showing a view of Shoreline and bubbles from around the world floating across the crowd.
Game mode is a race against the clock to grab as many bubbles as possible for your team. At the start of the game, users are split into four teams, and live updates of each team’s progress are displayed on the big screen, counting down until there are no more bubbles to grab…
The Google Maps Geocoding and Geolocation APIs were used to look up locations, and bubble counts were stored by location in the Firebase Realtime Database. Stats on recent locations and bubble counts by location are continually collected and displayed to the user.
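A minimal sketch of that counting flow. The address string follows the shape of a Google Maps Geocoding API `formatted_address`, but the in-memory `counts` object below is a stand-in for the Firebase Realtime Database transaction the real experience would use, and both function names are illustrative:

```typescript
// Turn a formatted address into a database-safe key.
// (Firebase Realtime Database keys cannot contain '.', '#', '$', '[', ']' or '/'.)
function locationKey(formattedAddress: string): string {
  return formattedAddress
    .replace(/[.#$\[\]\/]/g, '') // strip characters Firebase forbids in keys
    .trim()
    .replace(/\s+/g, '_');       // collapse whitespace into underscores
}

// Stand-in for a Firebase transaction like
// ref(`counts/${key}`).transaction(n => (n || 0) + 1).
function recordBubble(counts: Record<string, number>, formattedAddress: string): void {
  const key = locationKey(formattedAddress);
  counts[key] = (counts[key] || 0) + 1;
}
```

In the real service a transaction (rather than a read-modify-write) keeps the counter correct when many users increment the same location at once.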
We wanted to create an experience that was calming, atmospheric and dreamlike while skirting the line between feeling underwater and in the sky. We used dynamic animating gradients ranging from cyan to a light purple throughout the experience with a screen-space shader that hue shifted from top to bottom and a noise field that continuously animated in the background.
Dynamic cloud systems were built for each section, which helped create depth for the camera to move through. The locations section combines layered cloud textures representing a world map with masked videos from Google Earth. These work together to create the feeling of flying through the clouds and into a specific location without leaving the sky.
The Bubbles experience leverages a boatload of Google technology, including Chrome, Firebase, Google Earth, Maps, App Engine and Compute Engine.
We wrote a base shader shared across the varying bubble types to make what is a simple object appear more complex and visually interesting.
The most interesting bubble references we found had a layer of colorful oil that swirls across the surface of the sphere. It would be impossible to calculate a real liquid simulation on each bubble, so we devised some tricks to make it look as convincing as possible.
We render a liquid simulation to a small render target which is used on every bubble. By scaling, rotating, and hue shifting the texture we can create an illusion that each bubble has a unique oil coating. The texture itself is rendered in a “hypercolor” but is then blended with a custom blending function in WebGL to composite with the sky environment and other overlapping bubbles.
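One way the single shared render target can still read as unique per bubble is to derive a deterministic scale, rotation, and hue shift from each bubble’s id and pass them to the shader as uniforms. A sketch of that idea, where the function name, hash constants, and parameter ranges are all illustrative guesses rather than the production values:

```typescript
interface OilFilmParams {
  scale: number;    // texture scale factor
  rotation: number; // texture rotation in radians
  hueShift: number; // fraction of the hue wheel to rotate by
}

// Derive stable per-bubble texture parameters from the bubble's id,
// using a cheap sine-based hash into [0, 1).
function oilFilmParams(bubbleId: number): OilFilmParams {
  const hash = (seed: number): number => {
    const h = Math.sin(bubbleId * 127.1 + seed * 311.7) * 43758.5453;
    return h - Math.floor(h); // fractional part
  };
  return {
    scale: 0.5 + hash(0) * 1.5,      // 0.5x to 2.0x
    rotation: hash(1) * Math.PI * 2, // full rotation range
    hueShift: hash(2),               // anywhere on the hue wheel
  };
}
```

Because the parameters come from the id rather than random state, every frame (and every client) renders the same bubble the same way.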
We also focused on making the bubbles feel physical and tactile, letting users interact with an object in ways that wouldn’t be possible in the real world. For the mobile bubbles, we render high-definition icosahedrons with a uniform array of positions that are affected by the user’s touch.
These positions, which bounce and spring in and out of formation, are weighted to each vertex, and the offset from their initial position is applied to that vertex in the shader. When a bubble is forming, it grows and pushes other bubbles away as they bounce off each other.
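The bounce-and-spring motion can be sketched as a simple damped spring per control position: each step pulls the offset back toward its rest value and damps the velocity. The structure and constants here are illustrative, not the production tuning:

```typescript
interface SpringPoint {
  rest: number;     // rest offset along the vertex normal
  offset: number;   // current offset (displaced by touch)
  velocity: number;
}

// One simulation step: Hooke's law toward the rest position, plus damping.
function stepSpring(p: SpringPoint, stiffness = 0.1, damping = 0.85): void {
  p.velocity += (p.rest - p.offset) * stiffness;
  p.velocity *= damping;
  p.offset += p.velocity;
}
```

A touch simply kicks `offset` (or `velocity`) on the nearby control points; the spring then overshoots and settles, which is what produces the wobbly, tactile feel when the offsets are applied to vertices in the shader.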
To generate the shapes the bubbles flow into, we used a simple trick where the pixel values of a black and white image are iterated over and any value that is white gets pushed into a 2D array. This can then be converted to 3D by placing bubbles at a random value within the array and giving a random z position.
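The trick above can be sketched directly: scan the image’s RGBA pixel data (e.g. from a canvas `getImageData` call), collect the white pixels into an array, then assign each bubble a random entry plus a random depth. Function names and the whiteness threshold are illustrative:

```typescript
interface Point3D { x: number; y: number; z: number; }

// Collect the (x, y) coordinates of every white pixel in an RGBA image.
function extractShapePoints(
  pixels: Uint8ClampedArray, // 4 bytes per pixel
  width: number,
  height: number,
  threshold = 250
): Array<[number, number]> {
  const points: Array<[number, number]> = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // A pixel counts as "white" when all three color channels exceed the threshold.
      if (pixels[i] >= threshold && pixels[i + 1] >= threshold && pixels[i + 2] >= threshold) {
        points.push([x, y]);
      }
    }
  }
  return points;
}

// Place each bubble at a random white pixel, with a random z depth.
function placeBubbles(points: Array<[number, number]>, count: number, depth: number): Point3D[] {
  const targets: Point3D[] = [];
  for (let i = 0; i < count; i++) {
    const [x, y] = points[Math.floor(Math.random() * points.length)];
    targets.push({ x, y, z: Math.random() * depth });
  }
  return targets;
}
```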
For the wind within the simulation, we use a simple noise field that subdivides an imaginary 3D cube into smaller areas called voxels. For each voxel, a noise calculation gives a 3D direction vector. Bubbles within this system find the voxel which they are in and apply that voxel’s vector to their velocity.
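A sketch of that voxel wind field. A cheap sine-based hash stands in here for a real 3D noise function, and the class shape and constants are illustrative; the important structure is that every position inside the same voxel looks up the same direction vector:

```typescript
interface Vec3 { x: number; y: number; z: number; }

class WindField {
  constructor(
    private size: number,      // edge length of the imaginary cube, in world units
    private divisions: number  // number of voxels along each axis
  ) {}

  // Map a world position to its voxel indices, clamped to the cube.
  private voxelIndex(p: Vec3): [number, number, number] {
    const toIndex = (v: number) =>
      Math.min(this.divisions - 1, Math.max(0, Math.floor((v / this.size) * this.divisions)));
    return [toIndex(p.x), toIndex(p.y), toIndex(p.z)];
  }

  // Direction vector for the voxel containing p. A deterministic hash of the
  // voxel indices stands in for a real 3D noise lookup.
  direction(p: Vec3): Vec3 {
    const [ix, iy, iz] = this.voxelIndex(p);
    const hash = (seed: number): number => {
      const h = Math.sin(ix * 12.9898 + iy * 78.233 + iz * 37.719 + seed) * 43758.5453;
      return (h - Math.floor(h)) * 2 - 1; // map the fractional part to [-1, 1)
    };
    return { x: hash(0), y: hash(1), z: hash(2) };
  }

  // Apply the voxel's wind vector to a bubble's velocity, in place.
  applyTo(position: Vec3, velocity: Vec3, strength: number): void {
    const d = this.direction(position);
    velocity.x += d.x * strength;
    velocity.y += d.y * strength;
    velocity.z += d.z * strength;
  }
}
```

Animating the field over time (e.g. by feeding a time value into the noise lookup) makes the wind drift instead of staying frozen.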
Another exciting live installation project, made amazing through the contributions of users throughout the world. Thank you to the Google team for the concept and creative direction, and to Plan8 for the wonderful sounds that filled the stadium during the experience.