Protopian postcard: Augmented Reality

Franc Paul
Jul 13, 2016

The release of Pokémon Go this week, and the gigantic number of combined calories burned by hipsters trying to catch ’em all, is a phenomenon that is getting people talking about AR; but it is rather tame compared to the coming storm of Augmented Reality. I never liked Pokémon, but it has proven a popular and enticing way of getting people acquainted, comfortable and engaged with AR, even through the awkward and limiting mobile phone interface.

I won’t be playing Pokémon, but I’ve been thinking about AR for a while now, and thought I’d share some of the things I am excited about and anticipate will be likely in the not too distant future.

Take me down to Fairy Town

I do not have a very good memory. Presumably this is because I do not have a mind’s eye, and have only a very limited ability to have a visual experience apart from direct sensory input; a condition, on a spectrum, that has recently been described as Aphantasia.

Some memories are particularly clear though; not the situational context perhaps, but the content of my thoughts. So, I do not remember what triggered it, but I remember clearly thinking how interesting it would be to someday have toys that are programmed to act like real people, but at a small enough scale that dozens of them could live in a small table-top town or building. They would go about their business and interactions while kids like me could observe this micro-ecosystem, learn the rules that govern them, and see how the small-folk react to my inevitable acts of god.

A decade later I was very excited about The Sims, and it really was very good, but I remember being sad that this was likely as close as we’d get to that table-top town I had envisioned.

Some time ago I heard about the ludicrous amount of money Magic Leap had raised and was immediately intrigued. It took some time for pundits to piece together a good understanding of what they were using all this money for. By the time I learned about Meta, I was already convinced that the mobile phone’s dominance would soon come to an end.

Today, when someone mentions Augmented Reality my mind does not immediately go to the productivity enhancements that it will surely enable. My first thought is that this is the perfect platform for someone to finally build that micro-town; filled with autonomous little humanoids going about their business.

Magic Leap promotional image

With AR the possibilities are now endless — the micro-humans do not need to be confined to a micro-town, they are not limited by the physics that constrain us. Suddenly we can see lush environments filled with fairies and dragons — and whatever else Peter Jackson can conjure up — and crucially, we can move around and amongst them.

Also, they could actually be real people.

The scene I see and interact with could be a view into a virtual reality world where people are just going about playing their games, oblivious that their arena is also projected into my senses in just the right way so as to convince me that it is all happening on my living-room floor.

Share and interact with creations pinned to the environment.

Soon people will be able to create intricate and immersive pieces of digital art, and pin them to the physical world. Someone else with the right app will be able to discover these pieces of art, and interact with them, maybe remix them and share them via some telepresence projection with friends.

Buildings and public places will be decorated and enhanced. You’ll be able to see pinned content by tags and genres that you have chosen and artists or producers whom you are subscribed to; or only content with a good community rating. You’ll be able to see the space evolve through time and get a sense of the evolution of aesthetics in the space.
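The kind of opt-in curation described above can be sketched in a few lines. This is a minimal, hypothetical illustration; the names (`PinnedPiece`, `Viewer`, `visible_pieces`) and the rating scale are invented for the example, not any real AR platform’s API.

```python
from dataclasses import dataclass

@dataclass
class PinnedPiece:
    """A piece of digital art pinned to a physical location (hypothetical)."""
    title: str
    artist: str
    tags: set
    rating: float  # community rating, assumed here to run 0.0-5.0

@dataclass
class Viewer:
    """A viewer's curation preferences (hypothetical)."""
    chosen_tags: set
    subscriptions: set   # artists or producers the viewer follows
    min_rating: float = 3.5

def visible_pieces(pieces, viewer):
    """Return the pinned pieces this viewer has opted in to see:
    matching a chosen tag or a subscribed artist, above the rating bar."""
    return [
        p for p in pieces
        if (p.tags & viewer.chosen_tags or p.artist in viewer.subscriptions)
        and p.rating >= viewer.min_rating
    ]
```

So a highly rated mural tagged with one of your chosen genres would appear, a piece from an artist you follow would appear regardless of tags, and a poorly rated scrawl would stay invisible to you, even though it is still pinned there for viewers with different settings.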

There will be many pinned portals — places where you can see what it looks like right now somewhere else on earth. Maybe someone in Trafalgar Square would be able to see and interact with someone on Table Mountain. You’d be able to have a window in your office that always looks out at the sunset as it is happening somewhere on earth, or down into Central Park, or over a Martian plain, or Rivendell.

Avatars and projected self

Another fun aspect of AR is that we will be able to augment not only our experience of the world, but also others’ experiences of us. We’ll be able to create costumes that we stick to ourselves. It won’t be limited to a visual avatar. You’d be able to select theme music, or some enhancing effect on your immediate surroundings. You’ll be a stream of original content created procedurally by smart algorithms optimised to present your theme to the world.

Imagine a party where the DJ just lays down the beat to set the general vibe of the occasion, while everyone hears their own personalised, novel music, composed algorithmically in real time based on their preferences, social context and environment.

When you meet someone, you’ll be able to hear each other’s voices perfectly while your individual music styles will be blended in real-time to create a new merged genre that could set the vibe for your interaction.

You’ll be decorated in extravagant costumes, which you’ve copied and remixed and modified to fit your personality and style. Everyone and everything will be an artwork and contribute to the greater experience of everyone else.

Telepresence conversation and sense sharing

An obvious use for AR will be to reduce the sense of distance in telecommunication.

We’ve come a long way since landlines, through mobile phones, Skype, FaceTime and video conferencing, but we are about to have daily calls with family, friends and colleagues that feel far more personal and direct. Twenty years ago we were all quite happy predominantly using landlines. Twenty years from now we won’t be using mobile phones; we’ll be connected in much more ubiquitous and intuitive ways, enabled by AR.

With VR we will meet distant loved-ones in a virtual space where we can interact, but with AR we will bring them into our space, and be in theirs. We’ll share meals and intimate moments in a natural way.

There will be cameras and connected sensors everywhere and your devices will be able to use them — as well as mirrors and other reflective surfaces — to build very believable representations of you in your environment, and transmit the essence of you into the senses of people you are having a conversation with.

I’d be able to choose a particular mirror as the input for a business call, or allow full-body surround projection, like holograms, for family calls.

Your toddler would be able to visit with her grandparents on the other side of the world. They’ll not only see each other but be able to interact freely. Eventually, they would even be able to watch her while you are busy elsewhere.

When I’m struggling with a problem I’d be able to share my current view with an expert. She’ll be able to see what I’m seeing, and even where I’m focussing, and talk me through the next steps required. She could use hand gestures, overlaid onto my field of view, to help direct my actions.

We’ll be developing new senses, and become more social, sharing our moments in new more intimate ways.

If I’m in a new place, I’d be able to live-share not just video à la Facebook or Meerkat: you’d be able to see what I’m seeing, hear what I’m hearing, and eventually other senses too — smell, taste, even proprioception. People will pay good money to follow along with other people’s lives.

Persistent first-person reality streaming will be a thing. It will do strange things to our sense of self.

Extra-cortical computational filters — extended brain

Of course it is not only about increasing input to our senses: we will be adding a bunch of new senses too.

Our biological cortexes are collections, layers and hierarchies of filters and algorithms, managed by various levels of inhibition, analogy, feedback, consolidation and reinforcement. The more your mastery of a subject increases, the more you depend on the patterns and algorithms that you have built up over time through experience. Those algorithms are coded into neural firing patterns at ever higher levels of the cortex. This allows the brain of an expert to make quicker, more efficient and better judgements about situations in her domain of expertise than someone who is not an expert.

Our devices will increasingly become portals to the cloud that enhance and extend our own capabilities, adding patterns, filters and cues that will allow our brains to build mastery faster, and to outsource some of the benefits of experience to the algorithms in the cloud.

Concretely, when I fly my hang glider, a HUD will inform me of my current airspeed, altitude, sink-rate, orientation and likely direction and distance to house thermals that I can use to gain more altitude. I’ll be a more capable pilot and my brain will be able to focus more attention on the experience of flying and the motor controls needed for effective manoeuvring of the aircraft.
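The arithmetic behind such a HUD readout is simple still-air glide geometry. Here is a minimal sketch, assuming only airspeed, sink rate, altitude and straight-line distance to a known house thermal; a real instrument would also account for wind and the glider’s polar curve, and the function names and safety margin here are my own invention.

```python
def glide_ratio(airspeed_ms, sink_rate_ms):
    """Horizontal metres covered per metre of altitude lost (still air)."""
    return airspeed_ms / sink_rate_ms

def can_reach(altitude_m, distance_m, airspeed_ms, sink_rate_ms,
              safety_margin_m=150.0):
    """Could the glider reach a thermal distance_m away in still air,
    keeping safety_margin_m of altitude in reserve? (Illustrative only:
    ignores wind, terrain, and the glider's actual polar curve.)"""
    usable_altitude = altitude_m - safety_margin_m
    if usable_altitude <= 0:
        return False
    # Maximum still-air glide range from the usable altitude.
    return distance_m <= usable_altitude * glide_ratio(airspeed_ms, sink_rate_ms)
```

At 12 m/s airspeed with a 1.2 m/s sink rate the glide ratio is 10:1, so from 1000 m (keeping 150 m in reserve) a thermal 8 km away is reachable but one 9 km away is not; that is the kind of judgement the HUD would make continuously so the pilot doesn’t have to.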

We’ll shop for new apps and filters that will enhance our capacity, capabilities and experience of the world around us. We will free more of our brains to do things we can’t imagine yet.

Biosensors for feedback and better health outcomes

AR will not only be entertaining and help us accomplish more in less time; it will be truly good for us. We’ll have constant feedback on, and insight into, critical biological metrics. This will let us build intuition for our own physiology, anticipating in real time how our actions and environment affect our heart rate or metabolism; and we’ll be given genuinely personalised information and predictions, based on our specific histories and biology, that guide us towards better mental and physical health outcomes.

As a society we’ll be running millions of N=1 medical studies, and we’ll be able to combine that information into a real understanding of our species. Our data will inform and train algorithms allowing them to make better analogies and predictions.

A beautiful new world

Soon my bad memory won’t be a disadvantage. I’ll be prompted and reminded, my memory will be augmented by feedback, analogies, predictions and new senses from my devices overlaying my biological senses and the cloud algorithms enhancing my cortical abilities. I will actually be able to see visual analogies play out before my eyes and my Aphantasia will be a thing no more. We will all share an entirely new, rich, vivid and fantastical world.

I can’t wait.

What excites you most about AR? How will it change your industry?


Franc Paul

I wrangle with complexity, hang gliders, motorcycles, horses, people and code. Sometimes I share some of my thoughts about our journeys in Protopia.