Augmented Reality Projection Brokerage System Using Facial Recognition

If we assume it is only a matter of time before we can project augmented reality holograms onto translucent surfaces, liquids, or gases, then we need to accept that showing the right person the right image will not be easy.

In the world of smartphones, we can guarantee that we are looking at the image that is intended for us because we carry the phone around with us. It is tethered to us, so to speak, as opposed to a device that is used by multiple people and is tethered to a particular location. While it is possible that smartphones will generate holographic displays in the future, I suspect that most of the AR displays we encounter will be generated by physically tethered projection systems.

Today we have Wi-Fi routers and cell phone towers everywhere, which allow us to connect to the internet if we are willing to pay for it. In the near future, having access to AR displays will be almost as important as having internet access. Every store aisle, traffic intersection, store entrance, and public park will have AR projectors. This might seem a bit hard to believe, but the alternatives seem even less likely:

  1. We are all walking around with AR glasses that somehow need to recognize every object in our physical world and create a projection showing us information about that object. In this case, we would still need some kind of physical identifying tag on every “smart” object, or alternatively, the computer vision software in our AR glasses would need to be so incredibly powerful that it could identify an object out of trillions of possibilities within a few seconds.
  2. AR doesn’t take off and we keep searching for information about our surroundings on the internet by manually typing queries into a search engine.

So for now, let’s assume that AR projectors, or whatever name they will have in the future, are everywhere that Wi-Fi access points are now.

You are walking through a grocery store in the future. There is a holographic display emanating from a projector next to you in the avocado aisle. You do not need a headset to view it. This display shows data that you consider confidential, such as the contents of your fridge at home. You don’t want everyone around you to see what you see. How do you keep that display private? How does that projector know what it is supposed to be showing you?

The problem would be easy to solve if a headset or some kind of advanced contact lenses were involved. But without a filtering mechanism at eye level, it’s much more complex.

One possible solution: think of that holographic display as a website. The analogy might be a bit strained, but it explains the mechanism well. Millions of people can view a website at the same time, for example a bank’s website, and each of them will see different information based on the credentials they’ve entered. At an even simpler level, if ten people were reading the landing page of a news site and one person scrolled down, it would not affect the other nine.
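To make the analogy a bit more concrete, here is a minimal sketch in Python of how a projection broker might resolve content per recognized viewer, the way a web server resolves content per authenticated session. The names (`ViewerSession`, `resolve_view`) and the entitlement model are inventions for this example, not a real system.

```python
# Illustrative sketch only: a hypothetical "projection broker" that resolves
# content per recognized viewer, much like a web server resolves content per
# authenticated session. ViewerSession and resolve_view are invented names.
from dataclasses import dataclass


@dataclass
class ViewerSession:
    viewer_id: str          # identity established by facial recognition
    entitlements: set[str]  # content this viewer is allowed to see


def resolve_view(session: ViewerSession, requested: str) -> str:
    """Return the private content the viewer is entitled to, or a public fallback."""
    if requested in session.entitlements:
        return f"private view of '{requested}' for {session.viewer_id}"
    return "generic public display"


# Two shoppers standing at the same projector get different renderings,
# just as two people logged into the same bank website see different accounts.
alice = ViewerSession("alice", {"fridge-contents", "shopping-list"})
bob = ViewerSession("bob", {"shopping-list"})
print(resolve_view(alice, "fridge-contents"))  # private view for alice
print(resolve_view(bob, "fridge-contents"))    # generic public display
```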

Where the analogy breaks down is that if two people are using one computer screen to view the same website, they will inevitably see the exact same thing. The website cannot display one article for person A and a different one for person B on the same screen. Up to now, that hasn’t been a problem: our screens provide some level of privacy in that they can only be viewed from a limited range of angles, and for anything beyond that we can buy a privacy filter.

That will change when we have augmented reality projected in mid-air without glasses. We will need a method for showing holographic images only to the person who is intended to see them. You wouldn’t want to be walking down the grocery aisle reading the WhatsApp messages of every person who walks by you. That would be annoying for you, and an unacceptable invasion of privacy for your fellow shoppers.

The answer, as with a website, might be that you only see what you are entitled to see. Here is another thought experiment that might shed light on this idea: imagine you are in a national park and you want to learn about a volcano that you have just hiked to the top of. In the olden days you would read a physical board for tourists, waiting your turn to stand in front of it. In the current world you would open the app or website with the information, as would all your fellow tourists on their own devices. In the not-so-distant future, there will be an augmented reality projector where the tourist information display used to be.

That projector will create an image that is 20 feet wide by 10 feet tall and curved along the circumference of the volcano’s edge. You will only see the 2-foot by 2-foot square that belongs to you, and that distinction will need to be based on your physical location. In the same way that computers get IP addresses assigned when they log into a network, your eyes or your face will be assigned an IP address that determines which of the holographic squares you observe.
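A rough sketch of what that assignment could look like: a lease table that hands out two-foot squares to recognized faces, loosely the way a DHCP server leases IP addresses. Everything here, from the face IDs to the lease API, is an assumption invented for illustration.

```python
# Illustrative sketch, not a real protocol: a registry that leases display
# regions ("holographic squares") to recognized faces, loosely analogous to
# a DHCP server leasing IP addresses.
from dataclasses import dataclass


@dataclass(frozen=True)
class Region:
    x_offset_ft: float   # horizontal position along the 20 ft curved display
    width_ft: float = 2.0
    height_ft: float = 2.0


class RegionLeaseTable:
    def __init__(self, display_width_ft: float = 20.0, square_ft: float = 2.0):
        # Free squares, ordered so the leftmost one is handed out first.
        self._free = [Region(x) for x in
                      reversed(range(0, int(display_width_ft), int(square_ft)))]
        self._leases: dict[str, Region] = {}

    def lease(self, face_id: str) -> Region:
        """Assign (or return the existing) 2 ft x 2 ft square for this viewer."""
        if face_id not in self._leases:
            self._leases[face_id] = self._free.pop()
        return self._leases[face_id]

    def release(self, face_id: str) -> None:
        """Viewer walked away: return their square to the free pool."""
        region = self._leases.pop(face_id, None)
        if region is not None:
            self._free.append(region)


table = RegionLeaseTable()
print(table.lease("face:1a2b"))  # first viewer gets the square at offset 0
print(table.lease("face:9c8d"))  # second viewer gets the square at offset 2
```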

Now imagine there are one hundred holographic projectors around the periphery of the volcano. As you walk around, the holographic square moves with you, or at least it seems to. What is actually happening is that as the next closest projector recognizes your face or iris, it projects your specific holographic square at a distance that stays constant as you walk.
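Here is one way that hand-off could be sketched, assuming projectors arranged on a ring, a made-up viewing distance, and invented function names. It only illustrates the idea that ownership hops to whichever projector is nearest while the square’s apparent distance stays fixed.

```python
# Illustrative sketch of the hand-off idea: whichever projector is currently
# closest to a recognized viewer takes over and renders that viewer's square
# at a fixed distance in front of them. Projector layout, the 1.5 ft viewing
# distance, and the function names are assumptions for this example.
import math

# One hundred projectors spaced evenly on a 50 ft radius rim around the crater.
PROJECTORS = [(math.cos(2 * math.pi * i / 100) * 50,
               math.sin(2 * math.pi * i / 100) * 50)
              for i in range(100)]

VIEW_DISTANCE_FT = 1.5  # the square is always rendered this far from the viewer


def nearest_projector(viewer_xy: tuple[float, float]) -> int:
    """Index of the projector that should own this viewer right now."""
    return min(range(len(PROJECTORS)),
               key=lambda i: math.dist(PROJECTORS[i], viewer_xy))


def render_plan(viewer_xy: tuple[float, float]) -> dict:
    """Which projector fires, and where it places the viewer's square."""
    owner = nearest_projector(viewer_xy)
    px, py = PROJECTORS[owner]
    vx, vy = viewer_xy
    # Place the square on the line from projector to viewer, stopping
    # VIEW_DISTANCE_FT short of the viewer, so the apparent distance is constant.
    d = math.dist((px, py), (vx, vy))
    t = max(d - VIEW_DISTANCE_FT, 0.0) / d
    return {"projector": owner,
            "square_at": (px + (vx - px) * t, py + (vy - py) * t)}


# As the viewer walks along the rim, ownership hops from projector to
# projector, but the square stays 1.5 ft in front of them.
for step in range(0, 100, 25):
    angle = 2 * math.pi * step / 100
    print(render_plan((math.cos(angle) * 48, math.sin(angle) * 48)))
```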

You might ask what would happen if five people stood around one projector. Would they all see the same thing? Yes, but not with the same level of clarity. The projector would place their individual squares at the exact distance to their eyes where they would be either completely blurred or imperceptible to the others. Of course, if everyone stood in the exact same place (though it is hard to see how they could), the system would not work. But in the normal situation, where only one human can occupy one physical space, it should work.
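A rough sketch of the privacy check that idea implies, taking the blur premise at face value: the focal distance and blur threshold below are made-up numbers for illustration, not the output of any real optics model.

```python
# Sketch of a per-viewer placement check. The premise (a hologram focused at
# one viewing distance appears blurred from other distances) is the article's
# assumption; FOCAL_DISTANCE_FT and BLUR_THRESHOLD_FT are invented values.
import math

FOCAL_DISTANCE_FT = 1.5   # where each viewer's square is placed, from their eyes
BLUR_THRESHOLD_FT = 1.0   # eyes outside this band around the focal distance see blur


def square_position(viewer_xy, projector_xy=(0.0, 0.0)):
    """Point on the projector-to-viewer line, FOCAL_DISTANCE_FT short of the viewer."""
    px, py = projector_xy
    vx, vy = viewer_xy
    d = math.dist(projector_xy, viewer_xy)
    t = max(d - FOCAL_DISTANCE_FT, 0.0) / d
    return (px + (vx - px) * t, py + (vy - py) * t)


def is_private(viewer_xy, other_viewers) -> bool:
    """True if every other viewer stands well away from the sharp-focus distance."""
    spot = square_position(viewer_xy)
    return all(abs(math.dist(spot, other) - FOCAL_DISTANCE_FT) > BLUR_THRESHOLD_FT
               for other in other_viewers)


# Three viewers around one projector at the origin: two stand close enough
# together that their squares would be legible to each other, one does not.
viewers = [(3.0, 0.0), (2.0, 1.0), (-3.0, 1.0)]
for v in viewers:
    others = [o for o in viewers if o != v]
    print(v, "private" if is_private(v, others) else "too crowded")
```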