Exploring Spectacles as XR Prototyper

Here is a short walk-through of how I approached Spectacles 2021, came up with ideas, and built prototypes.

A little background

Snap is sending Spectacles 2021 to selected creatives. As far as I know, it’s typically not a paid partnership. You can apply to get a pair of the glasses here:

I personally had been keen to get them ever since they were released. I kept sending emails and applying until, one day, I got in 😜. I’m still trying to figure out what worked for me. The only advice I can give is to start building and sharing stuff to get noticed. Anything similar would work: AR lenses for Snapchat or VR interactions for Quest. The main idea is to show that you can and want to produce spectacular stuff.

I won’t do another Spectacles review, but I want to share my insights, as I learned a lot about them.

Instead of going for a full-blown version of future glasses, Snap built a comfortable, friendly device to wear. At least for as long as the battery and heat buildup allow. For example, I wore them while travelling on a train, and no one even looked at me. I would definitely get more attention with, say, a Magic Leap. Even turning the glasses on takes seconds, not minutes.

The second huge plus is the creation pipeline. Yeah, they’re locked into Snap’s ecosystem, but Lens Studio is outstanding. After switching back to Unity, I really missed just hitting the “Send to Devices” button, which simply worked.

When I got the glasses, I came up with the following objectives:

Learn. About the Snap ecosystem, needless to say, but also to test concepts I could apply to other platforms.

Have fun. Of course.

Get real projects. As an independent contractor, I just can’t play with new hardware all the time. I need real projects to pay my bills. So I wanted the exposure from these experiments to lead to compelling new contracts.

As usual, I work on such side projects on weekends or in my free time, so I try to keep the scope as lean as possible. Looking back, I noticed that I often automatically apply the same framework to all my experiments: I pick one technical challenge and one use case to test and try to avoid the rest. I’ll explain more in the examples below.

1. Baby steps

I was acquainted with Lens Studio, as I had explored it a lot while working for Verizon Media, building their tool for creating AR experiences. But I had very little hands-on experience. To get an idea of how things work, I wanted to complete the entire development cycle as soon as possible. I took an old lens my wife Inna had built for Snapchat last year and tried to make it work on Spectacles. I had the pipeline figured out in no time and went outside to test it in the wild.

I found out that having objects just fly around isn’t the best approach, as it’s easy to lose them. Also, when an object moves out of the FOV, it only emphasizes the FOV’s boundaries. In later prototypes, I always tried to keep the visuals small and local.

Tech challenge: figure out the pipeline

Use case: large 3D scene outside

2. Marker on guitar

Huge thanks to Snap for preparing the templates. I was able to build the next experiment literally the next day.

Here is a good example of cutting corners: for the dots on the guitar that show me where to press, I just made a looping video on a black background, changed its blend mode to Lighten, and connected it to the marker. I didn’t even try to build the whole experience, as my goal was just to test how it felt.

Actually, tracking wasn’t reliable enough, and in general the experience didn’t feel good enough to keep working on.

Tech challenge: marker tracking

Use case: learning to play an instrument (guitar)

3. AR train

Then I had to travel to Berlin. The day before, I saw a tutorial about Dynamic Texts. It’s a simple but interesting feature that lets you easily get user info like location and weather as text. So, I was looking for a use case where I could apply it.

So, while on the train, I thought it would be cool to see the cities I was passing through and other reasonably useful info. I pulled out my laptop and had the lens ready in less than 5 minutes. I had a hard time connecting my Spectacles to the train Wi-Fi, so I recorded the video using Snapchat on my phone. Later I released a similar lens for Spectacles.
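
Under the hood, the idea is plain string templating: placeholders in a text get swapped for live values. Here is a rough plain-JavaScript analogue of that behavior (the placeholder names and sample values are made up for illustration, not the actual Dynamic Texts tags):

```javascript
// Replace {placeholder} tokens in a template string with values from a
// context object; unknown placeholders are left untouched.
function fillTemplate(template, context) {
  return template.replace(/\{(\w+)\}/g, function (match, key) {
    return key in context ? String(context[key]) : match;
  });
}

// Example: contextual info while on a train (made-up values)
var label = fillTemplate("Now passing {city}, {temperature}°C outside", {
  city: "Berlin",
  temperature: 4
});
// label === "Now passing Berlin, 4°C outside"
```

In the real lens, Lens Studio supplies the contextual values itself; this sketch only shows why wiring such a lens up can take minutes rather than hours.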

Tech challenge: dynamic text

Use case: contextual info (in train)

4. Outdoor test

Back home, before going for a run with my dog, I decided to test how well Spectacles could track movement in world space. The white snow cover was supposed to make tracking useless. I didn’t even build anything and just used Snap’s default running lens. And it worked unexpectedly well. The tinted glasses were comfortable to wear outside. Obviously, the battery only lasted a short time, and I didn’t have the case with its 4x charges with me.

Tech challenge: pick the suitable lens :)

Use case: use for sports

5. Crypto

This exploration was technically a descendant of the guitar demo. I wanted to find a way to localize AR info and keep testing markers. The use case is simple: based on a recognized marker, show the banknote’s value in crypto.

Marker tracking worked pretty well. I even built functionality to calculate the combined value of all visible banknotes, but later found that only a single marker at a time is supported.
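
The combined-value idea itself is just a sum and a division. A minimal sketch in plain JavaScript, with a made-up, hard-coded exchange rate (the published lens was similarly frozen at creation-day rates):

```javascript
// Sum the face values of all recognized banknote markers and convert
// the total to BTC at a fixed rate. The rate is a made-up example,
// hard-coded the way the published lens froze its conversion rate.
var USD_PER_BTC = 40000;

function totalInBtc(banknoteValuesUsd) {
  var totalUsd = banknoteValuesUsd.reduce(function (sum, v) {
    return sum + v;
  }, 0);
  return totalUsd / USD_PER_BTC;
}

var btc = totalInBtc([20, 50, 100]); // 170 USD -> 0.00425 BTC
```

In practice, since only one marker could be tracked at a time, the array never held more than one value.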

Also, the data wasn’t dynamic. So, the lens I published had an accurate conversion rate only on the day I created it :)

Tech challenge: multiple markers

Use case: cash to crypto

6. Jump

After discovering how excellent world tracking was, I wanted to keep testing it. The obvious use cases are related to sports. Adam did a remarkable series of explorations in this direction, so I had difficulty coming up with something new :)

I did various technical tests. While measuring the vertical position of the headset in world space, I noticed that it was very accurate. Later I built an app to measure vertical jump height, and we went to the trampolines to stress test it.
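
The measurement boils down to comparing the headset’s height at the peak of a jump with its standing baseline. A minimal plain-JavaScript sketch of that idea (the sample positions and the median-baseline approach are my assumptions, not Snap’s API):

```javascript
// Estimate jump height from the headset's vertical world position:
// baseline = median of the first samples taken while standing still,
// height = highest point reached minus that baseline. Units: meters.
function jumpHeight(ySamples, baselineCount) {
  var base = ySamples.slice(0, baselineCount).sort(function (a, b) { return a - b; });
  var baseline = base[Math.floor(base.length / 2)];
  var peak = Math.max.apply(null, ySamples);
  return peak - baseline;
}

// Made-up headset y-positions: standing at ~1.70 m, peaking at 2.15 m
var samples = [1.70, 1.71, 1.70, 1.69, 1.70, 1.85, 2.05, 2.15, 2.00, 1.75];
var height = jumpHeight(samples, 5); // ~0.45 m
```

Using the median of the still samples makes the baseline robust to a bit of head-bob noise before the jump.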

It was a success: technically, the glasses worked faultlessly. The use case was interesting, and the glasses were no barrier to movement at all.

Tech challenge: accurate world tracking

Use case: use for sports (jumping on trampolines)

7. Trackpad limits

I’ve been thinking a lot about input. I always feel that voice has too high a margin of error. There is hand tracking, but it isn’t as good as on Quest, and I had played too much with hand demos anyway. The trackpad on the side of the glasses is usually treated as a temporary workaround, but I wanted to see if I could squeeze a little more out of it.

After building a quick tech demo, I found out that its accuracy is quite good. So, I decided to create a numpad for entering numbers (like for unlocking the device). There are better ways to build an unlocking mechanism for glasses, but it’s a perfect use case for testing the trackpad’s accuracy limits.

And it worked okay-ish. I could enter the desired numbers with unexpectedly high accuracy. A couple of interesting details:

  • Since I don’t know where my finger will land on the tiny trackpad before touching it, I only show the cursor once the pad is touched, and activate the button on release. It worked nicely.
  • The trackpad has two parts and a couple of system gestures. I used only the bottom 50% of the trackpad to avoid accidentally triggering them. It was still more than enough.
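
The two details above can be sketched as a single mapping function. A plain-JavaScript sketch, where the coordinate convention, grid layout, and key labels are my assumptions:

```javascript
// Map a normalized touch position (x, y in [0, 1], y = 0 at the top of
// the trackpad) to a key on a 3x4 numpad grid, using only the bottom
// half of the pad to stay clear of system gestures.
var KEYS = [
  ["1", "2", "3"],
  ["4", "5", "6"],
  ["7", "8", "9"],
  ["<", "0", "OK"]
];

function keyAt(x, y) {
  if (y < 0.5) return null; // top half reserved for system gestures
  var col = Math.min(2, Math.floor(x * 3));
  var row = Math.min(3, Math.floor((y - 0.5) * 2 * 4)); // bottom half -> 4 rows
  return KEYS[row][col];
}
```

In the actual lens, the cursor appears while the pad is touched, and a function like keyAt would be called once on release to fire the button.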

Later I decided to take it a little further and, as a joke, create a full QWERTY keyboard. And I’ve got to say, it wasn’t as bad as I expected. I could still type anything I wanted.

Tech challenge: explore trackpad

Use case: input method (PIN numpad & QWERTY keyboard)

8. Crypto with API

At Lens Fest, Snap released a couple of external APIs. The crypto API was as if created for my earlier crypto demo. So, I quickly figured out how it works and made one more “useful” public lens over the weekend: a converter of USD banknotes to crypto.

Tech challenge: connect external API

Use case: cash to crypto

After six weeks, I had to return the glasses. Here are some numbers:

- 7 prototypes built

- 5 published lenses

- 11 videos recorded and shared on social media

- >700k impressions on Twitter on one of the videos

- >450k views on LinkedIn on one of the videos

- >100 people pitched on how cool Spectacles are

- <1 hour spent getting started with Lens Studio

- >10 hours spent waiting for the glasses to cool down

- ∞ fun during creation

And most importantly, did I achieve my main objectives?

I even overachieved. After seeing my stuff, some great companies contacted me online, which led to multiple exciting gigs, which grew into long-term partnerships I couldn’t have imagined before, and which kept me so busy that I only got to publish this article almost a year after I started drafting it 😅


