Wearables in Mixed Reality: Oculus Quest + Samsung Watch

What if you could use a smartwatch in AR/VR? I built an app to test this and documented the whole process — and results — in this blog post.


Recently a few of us on the BadVR team saw a movie where a superhero used a magical talisman to literally turn back time, and it was so cool we wanted to make it real. The movie was Marvel’s Doctor Strange, and there’s a clip below where Dr. Strange, played by Benedict Cumberbatch, rotates his wrist to reverse the clock on an apple, undoing the bites taken from it.

So I got a new Samsung smartwatch from the store, an Oculus headset from the den, and wrote a bunch of code to make them talk to each other. The process was very difficult and filled with unexpected challenges that required extraordinary solutions. But it all came together in the end and the result was well worth the effort! By turning the dial on the watch face we could step backward and forward through time on a VR weather map.


Doctor Strange is a mind-bending superhero movie filled with incredible imagery and vivid visual effects. One simple character movement stood out in particular to us while watching, and it wasn’t the teleportation portals or the city flipping, but the little green infinity stone that controlled time.

As a bunch of engineer types, we critically analyzed what made this concept so engaging and broke it down into a few elements:

  • precise mechanical control from a rough hand gesture
  • color animation feedback for live motion
  • time travel is the best thing ever

We were determined to manifest this into reality somehow, some way. Well, virtual reality. At work, everyone uses the Oculus Quest, Microsoft HoloLens, HTC Vive, etc. to help customers step inside their data, shape it to their usage, and discover hidden insights. This requires constant testing of new AR/VR experiences like apps and games. But especially games, for research purposes of course!

The path to realization of this vision started with a trip to the local Best Buy to pick up a shiny new Samsung Watch4 Classic smartwatch (they run Android now). And then we grabbed an Oculus Quest 2 VR headset from company inventory (also runs Android).

These specific devices were chosen for technical reasons, but the broader concept could apply to almost any smartwatch and headset. The What’s Next section of this post discusses adding support for other wearables and augmented reality devices. We are just getting started!

Lessons Learned

  • Wearables definitely have a future in AR/VR
  • Oculus mixed reality passthrough has incredible potential
  • Samsung’s new dedication to Android is super convenient
  • Bluetooth BLE is great if/when you can get it working

Build Process

On the recommendation of others, we used mainstream dev tools to build, test, and deploy the applications to each platform. They are all free to download and install, and Unity also offers multiple upgrade paths into its commercial offerings with technical support, video tutorials, and more.

We used Android Studio for the watch app and Unity for the headset app.
Oculus Quest 2 VR headset

  • supports mixed reality passthrough mode
  • active community and developer support
  • allows Bluetooth peripherals
  • amazing hand tracking
  • super portable

Samsung Watch4 Classic

  • brand new Wear OS 3 version
  • inputs through dial, touchscreen, and side buttons
  • tactile click-click-click for each step of the dial ring
  • good battery life
  • easy to develop for

Building the Android Wear OS App

In order to wirelessly communicate between two devices, some kind of communication protocol is needed. Almost all electronics these days have Wi-Fi and Bluetooth, and there’s a special mode of Bluetooth called Low Energy (BLE) designed for small, fast data packets. BLE is ideal for our purpose: two mobile devices exchanging short messages.
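To make the "small and fast" point concrete, here is a minimal Kotlin sketch of the kind of compact packet such a setup could send over BLE. The event codes and two-byte layout are illustrative assumptions, not the exact format used in the project:

```kotlin
// Hypothetical input-event packet: [eventCode, payload].
// A couple of bytes fits comfortably inside BLE's default
// 20-byte notification payload.
enum class InputType(val code: Int) {
    ROTARY_LEFT(0x01), ROTARY_RIGHT(0x02), BUTTON(0x03), TOUCH(0x04)
}

// Encode an input event as a two-byte packet.
fun encode(type: InputType, payload: Int = 0): ByteArray =
    byteArrayOf(type.code.toByte(), payload.toByte())

// Decode a packet back into an event type and its payload value.
fun decode(packet: ByteArray): Pair<InputType, Int> {
    val type = InputType.values().first { it.code == packet[0].toInt() }
    return type to packet[1].toInt()
}
```

With a layout like this, each dial click or tap costs only two bytes on the wire, which keeps notification latency low.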

In Android Studio, we created an empty watch app and added listeners for rotary, button, and touch events. Then we added BLE Peripheral Mode functions and set up a BLE broadcast server with custom message codes for each input type. Most events were only a few bytes total.

For easier visual feedback, we added background color flashing and text labels for each input event. When rotating left, the screen would flash red and decrement a counter. When rotating right, the screen would flash green and increment a counter. When tapping, the relative screen touch position was shown as a percentage value and the screen flashed gray.
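The feedback logic itself is simple state. This Kotlin sketch models it with plain values (the color names stand in for the actual background flashes; the class and member names are our own illustration, not the project source):

```kotlin
// Minimal model of the on-watch feedback state described above.
class FeedbackState {
    var counter = 0
        private set
    var lastFlash = "none"
        private set

    // Dial rotation: right/clockwise increments and flashes green,
    // left/counter-clockwise decrements and flashes red.
    fun onRotate(clockwise: Boolean) {
        if (clockwise) { counter++; lastFlash = "green" }
        else { counter--; lastFlash = "red" }
    }

    // Screen tap: flash gray and report the relative touch
    // position as a percentage label.
    fun onTap(x: Float, screenWidth: Float): String {
        lastFlash = "gray"
        return "${(100 * x / screenWidth).toInt()}%"
    }
}
```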

Debugging was quick and easy through wireless ADB over Wi-Fi. The watch got its own IP address on the network, which the development computer and Android Studio instance connected to. With Debug mode enabled, the app would compile and load onto the watch in about 4 seconds, with a final size of ~3 MB, so there was no need for any device emulation.
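For anyone who hasn’t used it, the wireless ADB flow looks roughly like this. The IP address, ports, and pairing code are placeholders; the real values appear on the watch under Developer options → Wireless debugging:

```shell
# Pair once with the code shown on the watch (placeholder address/port)
adb pair 192.168.1.42:40001

# Connect using the address/port the watch displays for debugging
adb connect 192.168.1.42:5555

# From here the watch shows up in `adb devices`, and builds
# deploy over Wi-Fi just like over USB
adb install -r watchapp.apk
```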

Integrating Bluetooth BLE into Unity

Inside Unity, we created an empty project based on the Oculus integration SDK, and started from an example scene with mixed reality passthrough enabled. This required OpenXR support, which was a few button clicks away. We followed tutorials from Dilmer Valecillos to get us started on all of this.

Once the basic graphics and animations were in, we needed to get the headset connecting to the watch. Unity doesn’t really have an integrated BLE solution, so we found an inexpensive asset from the Unity Asset Store called “Bluetooth LE for iOS, tvOS and Android” by Shatalmic:

This asset includes native libraries for multiple platforms (including Android) and a unified Unity API interface. Unfortunately, this asset does not support running in Editor mode on a PC, so we had to do full device builds for testing.

The BLE API was implemented in Central Mode, so the headset acted as a client to the smartwatch’s server and subscribed to new input event notifications. When tapping on the watch face touchscreen or rotating the dial, the BLE Peripheral broadcasts a data packet that the Unity app receives as a subscription notification. The binary contents of the notification are decoded into a message payload, and Unity events are fired, triggering the interactive animations. Easy, right?
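The headset side of the project was C# inside Unity, but the notification-to-event dispatch has a simple shape that this self-contained Kotlin sketch illustrates. The event codes and handler names here are illustrative assumptions:

```kotlin
// Sketch of the central-side dispatch: each BLE notification packet
// from the watch is decoded and routed to an app-level handler.
class InputDispatcher(
    private val onRotate: (Int) -> Unit, // signed step count
    private val onTap: (Int) -> Unit     // touch position, percent
) {
    // Called once per notification from the subscribed characteristic.
    fun onNotification(packet: ByteArray) {
        when (packet[0].toInt()) {
            0x01 -> onRotate(-1)             // dial left: step back in time
            0x02 -> onRotate(+1)             // dial right: step forward
            0x04 -> onTap(packet[1].toInt()) // tap with relative position
        }
    }
}
```

In the real app, the rotate handler is what drives the weather map backward and forward through time.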

Unexpected Challenges

  • Unity’s limited Bluetooth support — There’s no “out of the box” BLE on Unity, at least nothing cross-platform. And even testing inside Windows UWP is difficult, so we turned to the Asset Store and found a great option. Unfortunately, that option didn’t work in Editor mode so we were constantly building to device for incremental testing. Anyone know a better way?
  • Bluetooth BLE “Peripheral Mode” — Simple at first glance but actually quite difficult to implement correctly. The documentation is rough and many tutorials on the web are incomplete or outdated. We had to learn some of this the hard way through experimentation, and of course each operating system works a little differently.
  • No USB connection to smartwatch — Typically when developing for a piece of hardware you try to eliminate variables like spotty wireless connections. Unfortunately, for these particular smartwatches, there is no way to directly connect a cable to them, so we had to learn how to use the wireless ADB debugging. Little bit of a learning curve but surprisingly reliable. Kudos to Google and Samsung.
  • Legibility of watch face in mixed reality — Not sure if it’s polarization or brightness or viewing angles, but the face of the watch is tough to read through mixed reality, partly because the Oculus experimental mode uses black and white cameras.
  • Oculus passthrough capture — Since this is still very much experimental, live capture of mixed reality mode is tricky with certain limitations on field of view. We fully anticipate this being resolved soon and found a rather interesting workaround for live capturing both passthrough and real life.

Extraordinary Solutions

To more rapidly test the passthrough mode on the Oculus Quest, I committed a bit of a mortal sin in the electronics world. Sorry not sorry, it was worth it! I found a used headset on eBay that was discounted due to poor cosmetic condition and missing controllers.

Pieces were removed, stripping off the face cushion, the plastic covering, and the lenses covering the LCD displays. The proximity sensor was rigged to always read as activated, and holes were drilled through the whole assembly so it could be mounted on a swinging microphone arm.

Now it’s easy to capture both real reality and mixed reality at the same time :)

Oculus Quest 2 with lenses removed — don’t try this at home!

What’s Next?

This is the end of one project and the beginning of another. There is really something special about this type of AR/VR interaction, how it activates a device that was previously dormant in the virtual world and opens it up as a new method of interaction. Maybe there are other wearables that would work? Smart rings from Oura and Motiv, fitness bands, an Apple Watch?

We’re planning to extend compatibility to the HoloLens and Magic Leap. If you have a project or product that you think would benefit, shoot me an email so we can try an integration. Also, if you are local to the LA or DC area and want to check this out in person, reach out for details.


  • Should this project be open sourced?
  • Should this be a video podcast walkthrough?
  • Should it all be packaged up as a product with technical support?

Please share your thoughts and honest feedback (positive and negative) in the comments below.



Step inside your data with BadVR's next-generation virtual reality data visualization platform.
