4 Lessons That I’ve Learned in Building an AR Basketball Project

Liwei Yu
Inborn Experience (UX in AR/VR)
8 min read · Sep 23, 2017

Recently I’ve been spending my spare time working on an AR Solo Basketball project with Unity and ARKit. (I was honored to see it featured by @ARkitweekly on Twitter & by Product Hunt Daily.)

Featured in ARkitweekly twitter

It was the first AR project that I worked on independently from scratch. As a designer, I’d like to share some design thinking from the problems that I’ve solved in this project.

Please take a look at the short demo before continuing:

https://youtu.be/iS1ZciI0ezc

The first time I saw ARKit, I was surprised by its real-time tracking of your camera's real-world position (Visual Inertial Odometry). I later used this feature to build the shooting heat map in my project. ARKit also estimates the real-world lighting in real time so rendering can match the environment, and during plane detection it provides a simple particle system and a blueish circle as visual feedback; this feedback can also be repurposed for onboarding education. ARKit does have cons at this point, though. First, you use your mobile device as the display, so the slice of the AR world you can see is limited by the screen size. Second, the interaction is pretty limited unless one could leverage third-party wearable hardware. Finally, it also drains the battery very fast due to heavy GPU usage.
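My project was built with Unity and the Unity ARKit plugin, but the same capabilities map directly onto ARKit's native Swift API. Here's a rough sketch of turning on world tracking, horizontal plane detection, light estimation, and the feature-point "particles" (the class and outlet names are just illustrative, not the shipped code):

```swift
import UIKit
import ARKit
import SceneKit

class GameViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!   // the AR camera view

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking gives the camera's real-world position (visual-inertial odometry),
        // which is what the shot heat map and running-distance features rely on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal      // detect the floor for the court
        configuration.isLightEstimationEnabled = true   // match rendering to the real lighting
        sceneView.session.run(configuration)

        // Show the raw feature points ("particles") as detection feedback during onboarding.
        sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints]
    }
}
```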

FTUE (First Time User Education) design — Lesson I.

I realized that many of my peers who also worked with ARKit had the same hair-pulling problem: how do you educate new users to use it properly? In order to get the model displayed properly, ARKit first requires users to find an ideal surface so the system can detect it and use it as the ground plane for your model in the real world. If the model is very large, you also need to guide users where to look, because of the limited screen size. Any unclear guidance can therefore easily confuse users.

So what kind of surface is an ideal surface? People in developer communities recommend an environment with good, high-contrast lighting: the more visible detail in the surface texture, the faster the detection. For example, a grainy wood floor is detected quickly, while a plain, pure white floor takes much longer.

So when designing the FTUE for this project, the first problem is helping users quickly detect the floor as the base of the court. We have to consider which type of information to use: Screen Information or Environmental Information. Screen Information is information that is fixed within the screen. Environmental Information is information displayed on objects in the environment; you can't see it if it's out of your sight. So the very first step of this project has to rely on Screen Information to teach users how to help ARKit read the environment. In the first version (Fig.1) the following instructions are given to the user: “When you see a blue rectangle, tap it. If don’t see, stay closer to the floor texture.” I modified the ARKit default blue square pattern by adding “Tap” text as well as a circle shape as the visual cue. This is the first piece of Environmental Information.

Fig.1 — 1st version of detect floor education

But after the first usability test, I got feedback like “Why I don’t see anything?”, “What is that particle?” and “What is that blue square?” The reason is that all the users were brand new to ARKit; no one was equipped with knowledge of the detection process. They didn’t even know where to look when they opened the app, let alone successfully find the blue circle.

In the next version I added a guide card (Fig.2) that tried to teach users where to look and what the particles and circle are.

Fig.2 — Guide card

In the second usability test, I found that users understood they needed to point the camera at the angle the graphic showed, and they knew what the particles and the blue circle represented. But the detection success rate was still very low, because the angle shown in the graphic was not actually optimal for detecting the floor.

After some more research and practice, I found that when you point at the floor and start to see some particles, moving closer to the floor makes the number of particles skyrocket, and the blue circle shows up on the display almost immediately. So I changed the guide to say exactly that in the third version (Fig.3).

Fig.3 — Guide card iteration

Instead of educating users on the function of the particles and the blue circle, I went straight to telling users the fastest way to find a suitable floor. As the graphic shows, you just need to point the camera at this angle.
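The shipped app doesn't surface any numbers, but if you're curious why "get closer to the floor" works, you can watch ARKit's raw feature-point count per frame. A rough sketch continuing the view controller above (the threshold and the guideLabel outlet are placeholders of my own, not from the real app):

```swift
import ARKit

extension GameViewController: ARSessionDelegate {
    // Requires sceneView.session.delegate = self during setup.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // rawFeaturePoints is the point cloud behind the on-screen "particles".
        let pointCount = frame.rawFeaturePoints?.points.count ?? 0

        // Placeholder threshold: once enough surface detail is visible,
        // swap the Screen Information from "move closer" to "tap the circle".
        guideLabel.text = pointCount > 50
            ? "Tap the circle to place the court"
            : "Point the camera down and move closer to the floor"
    }
}
```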

The test result was much better: users were able to detect the floor faster by just glancing at the guide, without even reading the text. Compared to the previous versions, I gave up on teaching users the technical background and instead just taught them to point the device properly. I then realized that, as an independent developer and designer, I had spent too much time trying to understand the technology and its background. I forgot that users are normal people; I tried to educate them the way I had learned, and that made me lose sight of the core user goal.

After users dismissed the card, they would see the following guide (Fig.4). I removed the blue rectangle, added a grid and a tap-target icon on the circle as the visual cue, and refined the copy to be more instructive.

Fig.4 — 2nd version of detect floor education

Tapping the blue circle makes the court show up, which leads to the next lesson I want to share.
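Under the hood, placing the court is a hit test against the detected plane. A minimal sketch of that step (courtNode stands in for the court model, however it's loaded):

```swift
import ARKit
import SceneKit

extension GameViewController {
    // Attached to the ARSCNView as a UITapGestureRecognizer.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let tapLocation = gesture.location(in: sceneView)

        // Only accept taps that land on an already-detected horizontal plane,
        // i.e. the blue circle the guide points the user toward.
        guard let hit = sceneView.hitTest(tapLocation, types: .existingPlaneUsingExtent).first else { return }

        // The hit result's worldTransform puts the court exactly on the real floor.
        let floorPoint = hit.worldTransform.columns.3
        courtNode.position = SCNVector3(floorPoint.x, floorPoint.y, floorPoint.z)
        sceneView.scene.rootNode.addChildNode(courtNode)
    }
}
```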

FTUE (First Time User Education) design — Lesson II.

It is a classic AR/VR problem: in the AR/VR world, the entire world could be interactive, but users' attention is usually limited to the current task and scenario. So I chose to animate the court to float up from the bottom (Fig.5).

Fig.5 — Basketball court is floating up

The first reason is that users' attention is on the floor in the previous step, so I used the float-up animation as interaction feedback.

Another reason is that it prompts users to look up, because its momentum is moving upward. I used no animation in the first version, and users did not understand that they needed to look up to see the basketball rim being set up. Instead, they kept tapping on the court, thinking more objects would show up.
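The float-up itself is simple to sketch with a SceneKit action; the timing, distance and easing below are guesses rather than the shipped values:

```swift
import SceneKit

// Start the court below the detected floor, where the user is already looking,
// then let its upward momentum pull their gaze toward the rim.
func floatUpCourt(_ courtNode: SCNNode, floorY: Float) {
    courtNode.position.y = floorY - 1.0

    let floatUp = SCNAction.moveBy(x: 0, y: 1.0, z: 0, duration: 1.5)
    floatUp.timingMode = .easeOut
    courtNode.runAction(floatUp)
}
```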

The problem was that, since my model is pretty big, users could not continue without looking up. So the FTUE text changes to “Look up” at this point. You might have noticed that this text is Screen Information; I'd prefer to use Environmental Information, such as an arrow pointing upward after the court appears, because it's more contextual and obvious.

As users look up, they can see the court setup animation.

Fig.6 — As the court is setting up, the instruction also changes
Fig.7 — Court setup overview

FTUE (First Time User Education) design — Lesson III.

Now the Screen Information changes to “Swipe up to shoot” (Fig.6). This part is the most intuitive, since swiping up to toss the ball has been used in many mobile games, and I believed users would just do it without a second thought. I placed the information at the bottom of the screen in the first version, but it turned out nobody ever noticed it was there: users mentioned that their attention was on the ball and the rim, and their fingers blocked the text. I moved the text closer to the ball so it fell within the attention area, and it was also necessary to increase its readability with shadows or a “swipe up” icon, because the text would often merge into the background environment.
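Mechanically, swipe-to-shoot is a pan gesture whose screen velocity becomes a physics impulse on the ball. A rough sketch of the idea (ballNode and the scaling constants are placeholders that would need tuning against the court size):

```swift
import UIKit
import SceneKit

extension GameViewController {
    // Attached to the ARSCNView as a UIPanGestureRecognizer.
    @objc func handleSwipe(_ gesture: UIPanGestureRecognizer) {
        guard gesture.state == .ended else { return }

        // Only an upward swipe counts as a shot; its speed sets the power.
        let velocity = gesture.velocity(in: sceneView)
        guard velocity.y < 0 else { return }
        let power = Float(min(-velocity.y / 1000.0, 3.0))   // placeholder scaling

        // Throw the ball up and away from the camera. ballNode is the basketball's
        // SCNNode, assumed to already have a dynamic physics body.
        guard let camera = sceneView.session.currentFrame?.camera else { return }
        let forward = camera.transform.columns.2             // camera looks down -Z
        let impulse = SCNVector3(-forward.x * power, power * 1.5, -forward.z * power)
        ballNode.physicsBody?.applyForce(impulse, asImpulse: true)
    }
}
```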

FTUE (First Time User Education) design — Lesson IV.

When users shot the ball, their attention would focus on the movement of the ball, and the problem was that most users did not even know they could step out to grab the ball, which is supposed to be the most exciting feature of ARKit (tracking your position in the real world). Instead they kept tapping the ball on the screen, trying to pick it up as in a traditional game. I then changed the Screen Information to “Now move your body to grab the ball”, but it still didn't work well.

I then decided to place a real-time footprint line (Fig.7) on the floor connecting the ball and the position where you're standing. This gives users a strong visual cue: please go grab the ball. The next usability test went much better: every user walked over to grab the ball without a second thought.

Fig.7 — Visual path indicator
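One way to build such a line is to rebuild a simple line geometry every frame between the player's floor position and the ball. In the app it's a textured footprint strip, but the idea is the same; pathNode, ballNode, and floorY below are assumed properties, not the real ones:

```swift
import ARKit
import SceneKit

extension GameViewController {
    // Called every frame so the line always connects the player to the ball.
    func updatePathToBall() {
        guard let camera = sceneView.session.currentFrame?.camera else { return }

        // Project the player's position straight down onto the floor plane.
        let cameraPosition = camera.transform.columns.3
        let start = SCNVector3(cameraPosition.x, floorY, cameraPosition.z)
        let end = SCNVector3(ballNode.position.x, floorY, ballNode.position.z)

        // A bare two-vertex line; the shipped app draws footprints along this path instead.
        let source = SCNGeometrySource(vertices: [start, end])
        let element = SCNGeometryElement(indices: [Int32(0), Int32(1)], primitiveType: .line)
        pathNode.geometry = SCNGeometry(sources: [source], elements: [element])
    }
}
```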

One last thing I want to share is the ability to keep track of your shooting heat map (Fig.8). This feature shows the magic of ARKit, and it solves the problem of how to let users see their performance.

Fig.8 — Court shot heat map

A red cross mark on the floor means you missed from that spot, while a blue check mark means you made a basket from that position. Because ARKit keeps track of the camera position, users can view a heat map just like NBA shooting metrics. That's why I encourage users to walk around and shoot from different spots instead of standing in one place and repeatedly retrieving the ball.
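The heat map falls out of the same camera transform: when a shot resolves, read the player's current world position and drop a marker there. A sketch of that step (the marker-building helpers and floorY are hypothetical names of my own):

```swift
import ARKit
import SceneKit

extension GameViewController {
    // Call when a shot is resolved. World tracking means the marker lands where
    // the player was actually standing, which is what builds the heat map.
    func recordShot(made: Bool) {
        guard let camera = sceneView.session.currentFrame?.camera else { return }
        let playerPosition = camera.transform.columns.3

        // Blue check for a make, red cross for a miss (hypothetical marker helpers).
        let marker = made ? makeCheckMarkerNode() : makeCrossMarkerNode()
        marker.position = SCNVector3(playerPosition.x, floorY + 0.01, playerPosition.z)
        sceneView.scene.rootNode.addChildNode(marker)
    }
}
```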

Fig.9 — Total running distance

Another way to leverage ARKit to bring more fun to users: I designed a feature that keeps track of how far users move and shows the total running distance at the end of the game (Fig.9).
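Distance tracking is just the camera transform again, accumulated frame by frame. A sketch under the assumption that lastPosition and totalDistance are stored properties on the view controller:

```swift
import ARKit
import simd

extension GameViewController {
    // Call from session(_:didUpdate:) on every frame.
    func accumulateRunningDistance(from frame: ARFrame) {
        let t = frame.camera.transform.columns.3
        let current = simd_float3(t.x, 0, t.z)              // ignore vertical movement

        if let last = lastPosition {
            let step = simd_distance(current, last)
            if step > 0.01 { totalDistance += step }        // skip sub-centimetre tracking jitter
        }
        lastPosition = current
    }
}
```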

In conclusion, ARKit is already a pretty powerful tool, and there are so many imaginative projects one could develop around it. Until the technology becomes more mature (so users don't need to take extra steps to help it detect the environment), it is worthwhile to spend some time designing and testing the FTUE for new users.

You can try it in the App Store: https://itunes.apple.com/us/app/ar-solo-basketball/id1286833218?mt=8
