Published in Velos Mobile

How to Set Up a 360 Video Experience in 2D Using Google VR SDK

Photo by Patrick Tomasso on Unsplash

Google offers the Google VR SDK for Android to allow you to build apps for Daydream and Cardboard. You can also use this SDK to create 360 experiences in both VR and 2D. The great thing about using Google VR SDK for Android is that it allows you to maintain a purely native codebase while providing a rich VR or 2D user experience.

Note: The Google VR SDK for Unity may be a good choice if you plan to share code between iOS and Android, or if your app is mostly VR-based. Another option for integrating 360 video experiences is the YouTube SDK.

Unfortunately, the Google VR SDK is quite old and written entirely in Java. In this quick post, we share our adaptation of the Google VR SDK source files for use in a 2D 360 experience.

*Disclaimer: All code is originally from the Google VR SDK. I have heavily adapted this code to fit our needs.

Goal: Show 360 video in 2D in a pure native Android app.

Note: We will not be reading the video from storage or through intent as in the Google VR SDK. We have simplified this tutorial to just take a 360 video from res/raw/ and show it in a 360 view.

A demo of this sample app can be found here:

Let’s begin! This post assumes you have basic Android development experience.

Step 1: Go ahead and clone this project from here.

You will be copying over a few files in Step 6. Alternatively, you can skip this step and grab the files directly from the GitHub repository when you reach Step 6.

Step 2: Download a sample 360 Video

We used this video, but any 360 video that normally works in a 360 player should work. We used the original resolution for this video.

Name it vrsample.mp4 and place it under the res/raw/ directory. You may need to increase your Android Studio heap size to accommodate this file.

Step 3: Create a new project in Android Studio.

The minSDK should be set to 24. The 2D version of the 360 experience works on devices that meet this minimum.

Step 4: Add Dependencies:

Add Google VR SDK dependency to app build.gradle:

At the time of this writing, we needed to use v1.190.0 of the Google VR SDK.
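As a sketch, the dependency looks like this in the Gradle Kotlin DSL (the Groovy syntax is analogous):

```kotlin
// app/build.gradle.kts
dependencies {
    // Google VR SDK base library (v1.190.0 at the time of writing)
    implementation("com.google.vr:sdk-base:1.190.0")
}
```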

We will also need Kotlin Coroutines, since we've replaced the original AsyncTask used for loading videos with a coroutine.
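The coroutines dependency goes alongside the VR SDK one; the version below is illustrative, so prefer the latest stable release:

```kotlin
// app/build.gradle.kts
dependencies {
    // Coroutines with Android main-thread dispatcher support
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.6.4")
}
```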

Step 5: Set up an activity or fragment to render the video.

In the sample project, we use activity_main.xml to include the VideoUiView, as well as MonoscopicView. Go ahead and copy activity_main.xml as well as video_ui.xml.
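As a rough sketch, activity_main.xml nests the two views along these lines. The package name and view IDs here are illustrative; use the ones from the sample project's copy of the file:

```xml
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Renders the 360 scene in 2D -->
    <com.example.vr360.MonoscopicView
        android:id="@+id/video_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <!-- Playback controls (seek bar, elapsed time) overlaid on the video -->
    <com.example.vr360.VideoUiView
        android:id="@+id/video_ui_view"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom" />
</FrameLayout>
```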



MainActivity sets the content view to activity_main, which contains a MonoscopicView and a VideoUiView. It configures the MonoscopicView and launches a coroutine that loads the sample video.
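A minimal sketch of that activity follows. The view IDs, the initialize() call, and the loadMedia() signature are assumptions based on this article's description, so match them against the files you copy from the sample project:

```kotlin
// Sketch only — adapt names to the copied sample files.
class MainActivity : AppCompatActivity() {
    private lateinit var videoView: MonoscopicView

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        videoView = findViewById(R.id.video_view)
        val uiView: VideoUiView = findViewById(R.id.video_ui_view)
        videoView.initialize(uiView)

        // Load the sample 360 video from res/raw on a coroutine
        // (this replaces the SDK's original AsyncTask).
        lifecycleScope.launch {
            videoView.mediaLoader.loadMedia(R.raw.vrsample, horizontalDegrees = 360f)
        }
    }

    override fun onResume() { super.onResume(); videoView.onResume() }
    override fun onPause() { videoView.onPause(); super.onPause() }
    override fun onDestroy() { videoView.destroy(); super.onDestroy() }
}
```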

Remember to call videoView.destroy() inside the onDestroyView() method if you choose to show the video in a fragment instead.


Step 6: Copy remaining files.

The rest of the files remain pretty much the same. Go ahead and copy the remaining files: Mesh.kt, SceneRenderer.kt, Utils.kt, MediaLoader.kt, MonoscopicView.kt, and VideoUiView.kt. These will remain as they are, unless you need any customization.

Here is a brief explanation of these files:

  • VideoUiView is just the UI related to controlling the video playback. It includes a seek bar, as well as text showing how many seconds have passed.
  • MonoscopicView renders the GL scene, and calls MediaLoader to load the video.

The rendering package contains the math and rendering utility methods required to load a video:

  • Mesh is a utility class that generates spherical meshes for video and images.
  • SceneRenderer will actually render this mesh, as well as the GL Scene using utility methods from Utils.

In order to render your own sample 360 video, make sure it is included under res/raw. Pass it to the loadMedia() function, along with the horizontal degrees: set this to 360 if you allow the user a full 360 view (i.e., you're using a 360 video), or to 180 if you plan to render a 180 video instead.
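For example, assuming the loadMedia() signature from the sample project (the parameter name and resource names here are illustrative):

```kotlin
// Full 360 video: complete horizontal sweep
videoView.mediaLoader.loadMedia(R.raw.vrsample, horizontalDegrees = 360f)

// 180 video: half sweep
videoView.mediaLoader.loadMedia(R.raw.sample180, horizontalDegrees = 180f)
```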

Hardware/Software Requirements:

  • As noted previously, the minSDK is set to 24. The 2D version of the 360 experience works on devices that meet this minimum.
  • A screen of 1080p or better should be sufficient to view a 4K 360 video.

The entire sample project can be found on GitHub.

We hope that this sample project makes your 360 video integration in Android easier!

Velos Mobile is a Mobile Design & Development Agency in San Francisco. Drop us a note if you would like to chat about anything mobile!




We are a group of technologists who are passionate about creating excellent mobile experiences. We’ve been doing this for a long time, and we love sharing our expertise from the first wireframe to the last bug. Check us out at

Seetha Annamraju

Android developer, traveler, mentor, vegan foodie.
