Augmented Virtuality

(more on “mixed reality”)

Dario Laverde
7 min read · Dec 29, 2017

I’ve recently been asked by several developers about using the HTC Vive’s front facing camera. Questions such as “How do I implement SteamVR’s Tron view in my own app?” or “How do I use WebCamTexture or Texture2D with Unity and [Vuforia, OpenCV, or another solution]?”

Let’s get a basic working implementation of a front facing camera filter like this black and white comic filter example found in OpenCV for Unity.

front facing filter fun

You can download the executable here to see the filter in action.

For camera access, there are two approaches with Unity: Texture2D or WebCamTexture.

Texture2D with TrackedCamera

In a previous article I’ve covered the basic use of the front facing camera (also referenced as TrackedCamera) by using the OpenVR API which we mapped to a Texture2D texture in Unity for a heads up display example (and since then the SteamVR Unity plugin also provides a similar example).

placing the controllers in front of a camera view that follows the HMD

WebCamTexture with Webcams

Using a WebCamTexture, you can access the Vive’s front facing camera as if it were just another webcam. You’ll find several examples out there, but in practice this gets tricky for a couple of reasons:

  • It may not be the default camera, so you may need to let the user select the camera or search for it by a specific name, “HTC Vive” in this case (see the sketch below). You may also need to activate the cameras in a certain order or activate the camera externally (I ran into an issue on Windows 10 where I needed to toggle through the cameras).
  • Most importantly, for WebCamTexture to work at all, you must disable the camera in SteamVR settings. This is not intuitive for the user, to say the least, though you could prompt the user to turn it off or manage the setting yourself (both require a SteamVR restart). The reason is that SteamVR holds on to the ‘webcam’ camera handle and shares it (e.g. you can run both the compositor’s camera view and your own view at the same time).

But if you do want to use WebCamTexture or need to provide access to additional 3rd person view cameras, then it’s technically possible and most of the available examples should just work. Just remember to disable the camera from within SteamVR settings.
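As a minimal sketch of the name-based search mentioned in the first bullet, something like the following should work, assuming the SteamVR camera setting is disabled and that the device name contains “HTC Vive” (the exact string may vary by driver; the class name is mine):

using UnityEngine;

public class ViveWebcamExample : MonoBehaviour
{
    private WebCamTexture webcamTexture;

    void Start ()
    {
        // search the available webcams for the Vive's front facing camera
        foreach (WebCamDevice device in WebCamTexture.devices) {
            if (device.name.Contains ("HTC Vive")) {
                webcamTexture = new WebCamTexture (device.name);
                GetComponent<Renderer> ().material.mainTexture = webcamTexture;
                webcamTexture.Play ();
                break;
            }
        }
        if (webcamTexture == null)
            Debug.LogWarning ("Vive camera not found (is it still enabled in SteamVR?)");
    }
}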

There are now several OpenCV Unity plugins on the Asset Store, or you can wrap OpenCV on your own or build on top of an open source C# implementation, e.g. http://github.com/shimat/opencvsharp

We’ll primarily walk through the ComicFilterExample from the OpenCV for Unity plugin (note: you should be able to make this work with other plugins).

This example originally uses WebCamTexture but we’ll change it to use a Texture2D instead. For reference, I’ve included the scripts that work with WebCamTexture in the full project here.

Here’s our scene:

You may notice I’ve also included the Vive Input Utility plugin, a VR toolkit that makes it easier to support controllers, trackers and much more. In fact, we will only be using it for one line of code, to handle a controller button that toggles the comic filter on or off (I suggest leaving the filter on by default as shown above).

Vive Input Utility provides several useful prefabs; the ViveCameraRig prefab dropped into the scene above corresponds to (and wraps) the CameraRig prefab from the SteamVR plugin.

The ComicViveFilter script shown in the inspector is attached to a quad that serves as our camera view, a child object of the Camera (hmd) object, so it follows your eyes. The transform’s scale, however, is hard coded here to the front facing camera’s resolution, and the quad is placed at a distance in the z direction so that it just covers your entire field of view.

Because the front facing camera points downward, we’ll need to compensate by lowering the quad in the Y direction so that the view matches the actual position, e.g. calibrated using the controllers.

Unfortunately this leaves a gap at the top of the view. Once you have this calibrated, you can calculate the camera’s downward angle, which turns out to be about 30 degrees. In this scene I removed the skybox and used a solid black background for this specific effect filter.
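For illustration, here’s one way to compute that offset from the measured angle, a minimal sketch assuming the quad is a child of the hmd camera at local distance z (the class and field names are mine):

using UnityEngine;

// drop the quad by z * tan(tilt) so the feed lines up with where the
// downward-tilted camera actually points
public class QuadTiltOffset : MonoBehaviour
{
    public float tiltDegrees = 30f; // measured downward angle of the camera

    void Start ()
    {
        Vector3 p = transform.localPosition; // quad is a child of the hmd camera
        float yOffset = -p.z * Mathf.Tan (tiltDegrees * Mathf.Deg2Rad);
        transform.localPosition = new Vector3 (p.x, yOffset, p.z);
    }
}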

I’ve divided the code in the script so that we can focus on the OpenCV part, the Comic filter, rather than the OpenVR TrackedCamera code.

Here’s Start and Update:

private void Start ()
{
    size = initTrackedCamera (); // open the camera and get the frame size
    initFilter (size);           // allocate the OpenCV matrices
}

private void Update ()
{
    // returns null unless a new camera frame has arrived
    byte[] framebuffer = updateTrackedCamera ();
    if (framebuffer != null) {
        if (enableFilter) {
            updateFilter (framebuffer);
        } else {
            // no filter: copy the raw frame straight to the texture
            texture.LoadRawTextureData (framebuffer);
            texture.Apply ();
        }
    }
    // toggle filter with a controller (checked every frame so a press is never missed)
    if (ViveInput.GetPressDown (HandRole.RightHand,
            ControllerButton.Trigger)) {
        enableFilter = !enableFilter;
        setupTransform (enableFilter);
    }
}

In Start we obtain the size of the frame when initializing the camera and then initialize the data structures (specifically the OpenCV matrices) based on that frame size. In Update we ask for the frame buffer every frame, but updateTrackedCamera only returns one when the camera frame has actually changed, so with the camera running at 30Hz against an overall 90fps render rate you’ll only filter about a third of the time. Lastly, we allow toggling between filtered and unfiltered views and adjust the transform accordingly (just one way of correcting a raw feed that is upside down or reversed).
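For reference, here’s a rough sketch of what those two camera methods can look like using the OpenVR C# bindings (Valve.VR). The class and field names are mine and error handling is trimmed, so treat it as an outline rather than this project’s exact code:

using System;
using System.Runtime.InteropServices;
using UnityEngine;
using Valve.VR;

public class TrackedCameraFeed : MonoBehaviour
{
    private ulong streamHandle;
    private uint width, height, bufferSize;
    private byte[] buffer;
    private IntPtr nativeBuffer;
    private uint lastFrameSequence = uint.MaxValue; // so the first frame counts as new
    private Texture2D texture;

    // open the hmd camera stream and report the frame size
    private Vector2 initTrackedCamera ()
    {
        var cam = OpenVR.TrackedCamera;
        cam.GetCameraFrameSize (OpenVR.k_unTrackedDeviceIndex_Hmd,
            EVRTrackedCameraFrameType.Undistorted,
            ref width, ref height, ref bufferSize);
        cam.AcquireVideoStreamingService (OpenVR.k_unTrackedDeviceIndex_Hmd,
            ref streamHandle);
        buffer = new byte[bufferSize];
        nativeBuffer = Marshal.AllocHGlobal ((int)bufferSize);
        texture = new Texture2D ((int)width, (int)height, TextureFormat.RGBA32, false);
        return new Vector2 (width, height);
    }

    // returns the latest frame, or null if the frame hasn't changed
    private byte[] updateTrackedCamera ()
    {
        var header = new CameraVideoStreamFrameHeader_t ();
        var err = OpenVR.TrackedCamera.GetVideoStreamFrameBuffer (streamHandle,
            EVRTrackedCameraFrameType.Undistorted, nativeBuffer, bufferSize,
            ref header, (uint)Marshal.SizeOf (typeof(CameraVideoStreamFrameHeader_t)));
        if (err != EVRTrackedCameraError.None ||
            header.nFrameSequence == lastFrameSequence)
            return null; // no new frame since the last call
        lastFrameSequence = header.nFrameSequence;
        Marshal.Copy (nativeBuffer, buffer, 0, (int)bufferSize);
        return buffer;
    }

    private void OnDestroy ()
    {
        OpenVR.TrackedCamera.ReleaseVideoStreamingService (streamHandle);
        Marshal.FreeHGlobal (nativeBuffer);
    }
}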

From WebCamTexture to Texture2D

In most of the OpenCV examples, you should be able to remove the references to WebCamTexture and skip straight to handing the first Mat (rgbaMat in this case) the camera’s frame buffer, as shown below:

private void updateFilter (byte[] framebuffer)
{
    // load the camera frame into the RGBA source Mat
    rgbaMat.put (0, 0, framebuffer);
    Imgproc.cvtColor (rgbaMat, grayMat, Imgproc.COLOR_RGBA2GRAY);
    bgMat.copyTo (dstMat);
    Imgproc.GaussianBlur (grayMat, lineMat, new Size (3, 3), 0);

    // posterize the gray image into three tones (black / gray / white)
    grayMat.get (0, 0, grayPixels);
    for (int i = 0; i < grayPixels.Length; i++) {
        maskPixels [i] = 0;
        if (grayPixels [i] < 70) {
            grayPixels [i] = 0;
            maskPixels [i] = 1;
        } else if (70 <= grayPixels [i] && grayPixels [i] < 120) {
            grayPixels [i] = 100;
        } else {
            grayPixels [i] = 255;
            maskPixels [i] = 1;
        }
    }
    grayMat.put (0, 0, grayPixels);
    maskMat.put (0, 0, maskPixels);
    grayMat.copyTo (dstMat, maskMat);

    // overlay the comic-style line art from a Canny edge pass
    Imgproc.Canny (lineMat, lineMat, 20, 120);
    lineMat.copyTo (maskMat);
    Core.bitwise_not (lineMat, lineMat);
    lineMat.copyTo (dstMat, maskMat);

    // convert the result straight onto the quad's Texture2D
    Utils.matToTexture2D (dstMat, texture);
}

After some OpenCV image processing calls and the gray and mask pixel manipulations, we arrive at our destination Mat, dstMat. Fortunately, the Utils class can convert our Mat directly to the Texture2D texture on the quad.

Hints and optimization tips

The front facing camera was not meant for AR, so plan accordingly. Augmented Virtuality is about adding reality (the camera feed) into VR rather than the other way around. You’re in a play area that remains the same most of the time, so use filters to peek into your actual play area rather than exclusively presenting a full screen view of reality. In this example we toggled back into full screen reality, which isn’t ideal with this camera; toggling into another filter, or into a partial or blended view within your virtual world, would make for a better experience.

Full screen filtering of reality looks cool initially but can grow old quickly, so experiment with portals, windows, periscopes, sunglasses, etc. that partially use a view of reality, especially within the context of your story or game.

If you have enough performance headroom and need to detect motion (e.g. for safety, avoiding a pet, or adding a tracker to a real object), you could simply turn on the existing “tron” view mode already built into SteamVR, using the OpenVR APIs as your alert:

EVRSettingsError SetTronMode (bool enable)
{
    EVRSettingsError e = EVRSettingsError.None;
    OpenVR.Settings.SetBool (OpenVR.k_pch_Camera_Section,
        OpenVR.k_pch_Camera_EnableCameraForCollisionBounds_Bool,
        enable, ref e);
    OpenVR.Settings.Sync (true, ref e);
    if (e != EVRSettingsError.None)
        Debug.LogError ("error setting tron mode");
    return e;
}
Also see Unity's blog:
https://blogs.unity3d.com/2017/06/16/codesnippets-toggle-vives-front-facing-camera-and-tron-mode-at-run-time

This effect filter was a basic example of using OpenCV with the camera, but it’s generally not how one should implement a screen or camera filter. You can use post processing and screen shaders instead: as you may have noticed, even though OpenCV makes use of the GPU, this comic filter also leaned on the CPU, which isn’t good if you’re concerned about performance. With the material provided on the quad in this example, not only can you add a shader for a filter, you can also assign a static image to the material’s texture (this requires setting the image importer to the correct size and image type) for easier debugging without a camera.

debug without an hmd by using a static image for your material’s texture, and select a shader for a filter
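As a starting point for a GPU approach, here’s a minimal post processing sketch that blits the rendered frame through a filter material (the class name and material are placeholders; the comic effect itself would live in whatever shader that material uses):

using UnityEngine;

[RequireComponent (typeof(Camera))]
public class ComicPostEffect : MonoBehaviour
{
    public Material filterMaterial; // a material using your filter shader

    void OnRenderImage (RenderTexture src, RenderTexture dst)
    {
        if (filterMaterial != null)
            Graphics.Blit (src, dst, filterMaterial); // the filter runs on the GPU
        else
            Graphics.Blit (src, dst); // pass through when no filter is assigned
    }
}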

There are also additional optimizations you can try, such as making use of LateUpdate, or using dithering and blue noise to reduce camera image artifacts when there’s motion (unlike moving tracked objects, there’s no easy next frame prediction for the camera image).

The quad also didn’t need to be placed far away just to let you add objects in front of it: you could dedicate a layer to it so that it doesn’t occlude or cull other objects the camera renders, as sketched below. In this example we used an undistorted frame type from the front facing camera, but you should experiment with a distorted frame type as well. Oh, and an unlit shader is preferable to lighting the quad with a spotlight.
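Here’s one way the layer trick might look; the “CameraFeed” layer and all names here are assumptions, with a second camera at a lower depth drawing the feed behind everything else:

using UnityEngine;

public class FeedLayerSetup : MonoBehaviour
{
    public GameObject feedQuad; // the quad showing the camera texture
    public Camera hmdCamera;    // the main scene camera
    public Camera feedCamera;   // same pose as hmdCamera, renders only the feed

    void Start ()
    {
        int feedLayer = LayerMask.NameToLayer ("CameraFeed"); // add in Tags & Layers
        feedQuad.layer = feedLayer;
        hmdCamera.cullingMask &= ~(1 << feedLayer);    // scene camera skips the feed
        feedCamera.cullingMask = 1 << feedLayer;       // feed camera draws only the feed
        feedCamera.depth = hmdCamera.depth - 1;        // render the feed first
        hmdCamera.clearFlags = CameraClearFlags.Depth; // don't erase the feed behind
    }
}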

Hopefully you can use your imagination for when and where you could inject some tiny doses of reality from the front facing camera into your virtual experiences. If you do, I’d like to hear about what you’re working on.

Written by Dario Laverde

VR/mobile developer, community leader and director of developer relations at HTC