Micro Experiments in AR Design — Part 1
Looking to create connections
This is the first article in a series where I share small experiments in Augmented Reality design. I explain my thought process for each, and link to the wiARframes for you to experience in the app, and remix online. I hope these can serve as inspiration for your own creative exploration.
Who moved my cheese? 🧀 🐁 🐈
I’m obsessed with interactivity, and whilst a lot of AR still focuses on displaying visually pleasing objects, I’m personally most interested in how these objects then interact with each other and with us.
In this experiment, there are three actors: the cat, the mouse, and you. The cat simply watches the mouse (its rotation is bound to look towards the mouse). Similarly, the mouse always faces the cheese, but also moves towards it (velocity in its facing direction). Then there’s you: you participate in this experience by moving the cheese.
Whilst this is very simple, it is one of the techniques we can use to create a sense of connection between us and the objects in our augmented layer. In a full app we could even replace the virtual cheese with a real detected object or tracked image.
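wiARframes expresses these bindings visually rather than in code, but the underlying logic is simple enough to sketch. Here is a rough, hypothetical Python version in 2D (the function names, positions, and speed are my own assumptions, not the app’s API): the cat only rotates towards the mouse, while the mouse rotates towards the cheese and moves in its facing direction.

```python
import math

def look_at(pos, target):
    """Yaw angle (radians) that points an object at a target on the ground plane."""
    return math.atan2(target[1] - pos[1], target[0] - pos[0])

def step_mouse(mouse, cheese, speed, dt):
    """Face the cheese, then move with velocity in the facing direction."""
    yaw = look_at(mouse, cheese)
    return (mouse[0] + math.cos(yaw) * speed * dt,
            mouse[1] + math.sin(yaw) * speed * dt), yaw

# One frame of the experiment: the cat only rotates; the mouse rotates and moves.
cheese = (4.0, 0.0)                                   # you place this
mouse, mouse_yaw = step_mouse((0.0, 0.0), cheese, speed=1.0, dt=0.1)
cat_yaw = look_at((0.0, 2.0), mouse)                  # cat just watches
```

Because you move the cheese, every frame both bindings re-evaluate, and the whole scene appears to respond to you.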
Eye Spy 👀
In the previous example, we bound the rotation of an object to that of another object in the scene. We can, however, also bind the rotation of an object to our own position or rotation, i.e. we can have an object look at our position, or match our rotation.
In this experiment, there are two eyes. The first watches us wherever we go (rotation bound to look at the camera). This works well with 2D panel images too — often called billboarding.
The second eye matches the rotation of the camera. The difference can actually be quite subtle, as we tend to adjust our own rotation to look in the direction of the objects.
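The distinction between the two eyes can be sketched in a couple of lines. This is a hypothetical illustration, not the app’s implementation: the first eye computes a yaw from the camera’s position, the second simply copies the camera’s yaw and never looks at where we are.

```python
import math

def billboard_yaw(obj_pos, camera_pos):
    """Eye one: yaw (radians) turning the object to face the camera's position."""
    dx = camera_pos[0] - obj_pos[0]
    dz = camera_pos[2] - obj_pos[2]
    return math.atan2(dx, dz)

def matched_yaw(camera_yaw):
    """Eye two: ignore positions entirely and copy the camera's own rotation."""
    return camera_yaw

# An eye directly ahead of the camera turns back to face it (yaw 0)...
eye_a = billboard_yaw((0.0, 0.0, -2.0), (0.0, 0.0, 0.0))
# ...while the rotation-matched eye mirrors whatever yaw we currently have.
eye_b = matched_yaw(0.4)
```

When we stand still and only turn our head, `eye_a` doesn’t change at all while `eye_b` turns with us, which is exactly why the two behaviours feel so similar until you move around the scene.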
A Responsive Photo Frame 🙈
So far we’ve looked at objects looking at other objects that we control, and objects looking at us. The flip side of this is us looking at objects, and having that change the objects. I explored this in a lot more detail in my previous article on mobile AR gestures, but wanted to recreate a basic version here.
In this example, there is a photo frame with a picture of a friend. She doesn’t notice you until you look at her, at which point she’ll smile. If you stare, though, she’ll get a bit annoyed; she’ll relax again when you stop looking.
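Under the hood this is a tiny gaze-driven state machine. Here’s a minimal Python sketch, assuming the app gives us a per-frame `is_looking` flag from its gaze detection; the state names and the three-second stare threshold are my own invention:

```python
NEUTRAL, SMILING, ANNOYED = "neutral", "smiling", "annoyed"

def update(gaze_time, is_looking, dt, stare_threshold=3.0):
    """Advance the photo's expression one frame.

    is_looking: whether the gaze ray currently hits the frame (assumed
    to come from the app's gaze detection).
    gaze_time: accumulated seconds of continuous looking.
    Returns the new (state, gaze_time) pair.
    """
    if not is_looking:
        return NEUTRAL, 0.0           # she relaxes when you look away
    gaze_time += dt
    if gaze_time >= stare_threshold:
        return ANNOYED, gaze_time     # staring too long annoys her
    return SMILING, gaze_time         # a glance makes her smile

# Simulate four seconds of staring, one second per frame:
state, t = NEUTRAL, 0.0
for _ in range(4):
    state, t = update(t, True, 1.0)   # glance -> smile -> ... -> annoyed
away, _ = update(t, False, 1.0)       # looking away resets her
```

The same pattern generalises: swap the expressions for a highlight colour and a selection event, and you have gaze-based selection.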
Practically, this can be used as a mechanism for highlighting and selection by the user, but it can also just be a neat way to create a connection with a virtual object.