Pranav Agarwal
Published in ACM VIT
9 min read · Jan 5, 2019


AUGMENTED REALITY: EXPLORED

Augmented Reality (AR) is an interactive experience of a real-world environment, where the objects that reside in the real world are “augmented” by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory, among others.

SIMPLE TERMS?

Augmented reality is the technology that expands our physical world by adding layers of digital information onto it, deploying virtual images over real-world objects. AR appears in direct view of an existing environment and adds sounds, videos, and graphics to it. In short, AR is a view of the physical, real-world environment with computer-generated images superimposed on it, which changes our perception of reality.

The overlay is rendered in real time on the input received from a camera or another input device such as smart glasses. This superimposition of virtual images over real-world objects creates an illusion that can effectively engage users in a virtual world.

IS IT DIFFERENT FROM VIRTUAL REALITY?

Unlike virtual reality, which creates a totally artificial environment, augmented reality uses the existing environment and overlays new information on top of it. Virtual reality places you in a completely computer-generated environment to interact with and be immersed in. Augmented reality adds to the reality you would ordinarily see rather than replacing it.


Augmented reality is actually a mixture of real life and virtual reality, somewhere in between the two, so it’s often referred to as mixed reality.

EVOLUTION OF AR

AR in the 1960s:

In 1968, Ivan Sutherland and Bob Sproull created the first head-mounted display, which they called The Sword of Damocles. It was a rough device that displayed primitive computer graphics.

AR in the 1970s:

In 1975, Myron Krueger created Videoplace, an artificial reality laboratory. Krueger envisioned interaction with digital objects driven by human movement, a concept the lab realized with projectors, video cameras, and onscreen silhouettes.

AR in the 1980s:

In 1980, Steve Mann developed EyeTap, an early wearable computing device designed to be worn in front of the eye. It recorded the scene, superimposed effects on it, and showed the result to the user, who could also interact with it via head movements. In 1987, Douglas George and Robert Morris developed the prototype of a heads-up display (HUD) that showed astronomical data over the real sky.

AR in the 1990s:

The year 1990 marked the birth of the term “augmented reality”. It first appeared in the work of Thomas Caudell and David Mizell, researchers at Boeing. In 1992, Louis Rosenberg of the US Air Force created an AR system called “Virtual Fixtures”. In 1999, a group of scientists led by Frank Delgado and Mike Abernathy tested new navigation software that generated runways and streets using data from a helicopter video.

AR in the 2000s:

In 2000, the Japanese scientist Hirokazu Kato developed and published ARToolKit, an open-source SDK. It was later adapted to work with Adobe Flash. In 2004, Trimble Navigation presented an outdoor helmet-mounted AR system. In 2008, Wikitude released the AR Travel Guide for Android mobile devices.

AR today:

In 2013, Google beta-tested Google Glass, which connected to the internet via Bluetooth. In 2015, Microsoft presented two brand-new technologies: Windows Holographic and HoloLens, AR goggles packed with sensors that display HD holograms.

AR FOR THE FUTURE?

As we advance through the digital revolution, “The Matrix” makes us question the difference between fiction and reality. That is partly because hardware engineers and software developers continue to refine their augmented reality technologies. Augmented reality may soon be the true reality for us all.

By June 2017, 336 startups based on augmented reality had been listed on AngelList, and the AR user base was expected to grow to 1 billion by 2020.

Phones, tablets, and glasses will not be the only venues for AR. One example is the development of augmented reality earbuds, which let you adjust the sounds coming in from your surroundings. Research also continues apace on including AR functionality in contact lenses and other wearable and self-operated devices.

The ultimate goal of augmented reality is to create a convenient and natural immersion, so there’s a sense that phones and tablets will get replaced, though it isn’t clear what those replacements will be. Even glasses might take on a new form, as “smart glasses” are developed for blind people.

WORKING OF AR

Augmented reality can work using one of the following approaches:

1. SLAM:

SLAM (Simultaneous Localization and Mapping) is the most effective way to render virtual images over real-world objects. A SLAM system localizes the device’s sensors with respect to their surroundings while simultaneously mapping the structure of the environment.

SLAM is an approach to solving complex AR problems rather than any specific algorithm or piece of software: a SLAM system is a set of algorithms aimed at solving the simultaneous localization and mapping problem. This can be done in multiple ways, and nearly every augmented reality development kit now provides its own SLAM functionality.
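To make this concrete, here is a minimal sketch of the localization half of SLAM: estimating how the camera moved between two consecutive video frames from matched image features, using OpenCV. It assumes a webcam at index 0, and the intrinsics matrix K holds placeholder values rather than a real calibration; a full SLAM system would also build and refine a map of the environment from these estimates.

```python
import cv2
import numpy as np

# Assumed camera intrinsics (placeholder values, not a real calibration).
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

orb = cv2.ORB_create(2000)                                   # feature detector/descriptor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)   # brute-force matcher

cap = cv2.VideoCapture(0)                                    # webcam (assumed at index 0)
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
prev_kp, prev_des = orb.detectAndCompute(prev_gray, None)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, des = orb.detectAndCompute(gray, None)
    if des is None or prev_des is None:
        prev_kp, prev_des = kp, des
        continue

    # Match features between the previous and the current frame.
    matches = matcher.match(prev_des, des)
    if len(matches) < 8:
        prev_kp, prev_des = kp, des
        continue
    pts_prev = np.float32([prev_kp[m.queryIdx].pt for m in matches])
    pts_curr = np.float32([kp[m.trainIdx].pt for m in matches])

    # Relative camera motion between the two frames: rotation R and
    # translation direction t (scale is unknown from a single camera).
    E, _ = cv2.findEssentialMat(pts_curr, pts_prev, K, method=cv2.RANSAC)
    if E is not None and E.shape == (3, 3):
        _, R, t, _ = cv2.recoverPose(E, pts_curr, pts_prev, K)
        print("translation direction:", t.ravel())

    prev_kp, prev_des = kp, des

cap.release()
```

Production kits such as ARCore and ARKit wrap far more sophisticated versions of this loop, fusing camera and inertial data to keep tracking stable.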

2. Recognition based:

Recognition-based (or marker-based) augmented reality uses a camera to identify visual markers or objects, such as a QR/2D code or natural feature tracking (NFT) markers, and shows an overlay only when the marker is sensed by the device. Marker-based AR depends on the device’s camera to distinguish a marker from other real-world objects.

Not only the marker image but also its position and orientation can be calculated. Once recognized, the marker on screen is replaced with a virtual 3D version of the corresponding object. This permits the user to observe the object in more detail and from various angles; rotating the marker rotates the virtual replica as well.
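As an illustration, the sketch below detects ArUco markers (a common type of fiducial marker) in a webcam feed and recovers each marker’s position and orientation relative to the camera. It assumes opencv-contrib-python with the classic cv2.aruco API; the camera matrix, distortion coefficients, and 5 cm marker size are placeholder values, not a real calibration.

```python
import cv2
import numpy as np

# Assumed intrinsics and distortion (placeholders; real apps use a calibration).
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)
MARKER_SIZE_M = 0.05                     # printed marker assumed to be 5 cm wide

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary, parameters=params)
    if ids is not None:
        # Orientation (rvec) and position (tvec) of each marker relative to the camera.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIZE_M, K, dist)
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        for rvec, tvec in zip(rvecs, tvecs):
            print("orientation:", rvec.ravel(), "position (m):", tvec.ravel())
    cv2.imshow("marker AR", frame)
    if cv2.waitKey(1) == 27:             # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In a real app, each rvec/tvec pair would drive the rendering of a 3D model on top of the marker instead of being printed to the console.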

3. Location Based:

In contrast to recognition-based AR, location-based AR relies on GPS, a digital compass, a velocity meter, or an accelerometer to provide location data, and the augmented reality visualizations are activated based on these inputs. It is also known as markerless augmented reality. The location-detection features in smartphones make this type of augmented reality easy to leverage, which is why it is so popular. Common uses of location-based AR include mapping directions, finding nearby services, and other location-centric mobile apps.
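To show the idea, here is a small, self-contained sketch of how a location-based AR app might decide whether to draw an overlay for a point of interest (POI): it compares the bearing from the device to the POI against the compass heading and an assumed 60-degree camera field of view. All coordinates and the heading are made-up example values.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in metres."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing from the first point towards the second, in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

# Made-up example values: device location, compass heading, and a nearby POI.
device_lat, device_lon, heading = 12.9692, 79.1559, 90.0   # facing roughly east
poi_lat, poi_lon = 12.9700, 79.1600

distance = haversine_m(device_lat, device_lon, poi_lat, poi_lon)
bearing = bearing_deg(device_lat, device_lon, poi_lat, poi_lon)
offset = (bearing - heading + 180) % 360 - 180             # signed angle from view centre

FIELD_OF_VIEW_DEG = 60.0                                   # assumed camera field of view
if abs(offset) <= FIELD_OF_VIEW_DEG / 2:
    print(f"Draw POI overlay: {distance:.0f} m away, {offset:+.1f} degrees from centre")
else:
    print("POI is outside the camera's field of view")
```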

4. Projection-based AR:

Projection-based AR projects synthetic light onto physical surfaces and, in some cases, allows the user to interact with it. These are the holograms we have all seen in sci-fi movies like Star Wars. User interaction with a projection is detected by sensing its alterations.

5. Superimposition-based AR:

Superimposition-based AR replaces the original view, fully or partially, with an augmented one. Object recognition plays a key role here; without it, the whole concept is simply impossible. We’ve all seen superimposition-based augmented reality in the IKEA Catalog app, which allows users to place virtual items from the furniture catalogue in their rooms.
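A bare-bones sketch of the superimposition step is shown below: it warps a catalogue image onto four corner points in a photo of a room using a homography, again with OpenCV. The file names and corner coordinates are placeholders; a real app like IKEA Place would obtain the target corners from plane or object recognition rather than hard-coding them.

```python
import cv2
import numpy as np

# Placeholder file names: a photo of the room and the catalogue item to place in it.
frame = cv2.imread("room_photo.jpg")
overlay = cv2.imread("sofa_catalogue.jpg")

h, w = overlay.shape[:2]
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])          # corners of the catalogue image
# Where those corners should land in the room photo (assumed, hard-coded values):
dst = np.float32([[220, 310], [460, 300], [470, 450], [210, 470]])

# Homography mapping the catalogue image onto the target region, then warp it.
H, _ = cv2.findHomography(src, dst)
warped = cv2.warpPerspective(overlay, H, (frame.shape[1], frame.shape[0]))

# Replace only the target region of the original view with the warped item.
mask = np.zeros(frame.shape[:2], dtype=np.uint8)
cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
frame[mask > 0] = warped[mask > 0]

cv2.imwrite("augmented_room.jpg", frame)
```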

COMPONENTS INVOLVED:

Cameras and sensors:

These collect data about the user’s interactions and send it for processing. Cameras on devices scan their surroundings, and with this information a device locates physical objects and generates 3D models. They may be special-purpose cameras, as in the Microsoft HoloLens, or common smartphone cameras that take pictures and videos.

Processing:

AR devices eventually should act like little computers, something modern smartphones already do. In the same manner, they require a CPU, a GPU, flash memory, RAM, Bluetooth/Wi-Fi, GPS, and so on to measure speed, angle, direction, orientation in space, and more.

Projection:

This refers to a miniature projector on AR headsets, which takes data from the sensors and projects digital content (the result of processing) onto a surface to view. In fact, projection in AR is not yet mature enough for use in commercial products or services.

Reflection:

Some AR devices have mirrors to assist the human eye in viewing virtual images. Some have an “array of small curved mirrors”, and some have a double-sided mirror that reflects light to a camera and to the user’s eye. The goal of such reflection paths is to achieve proper image alignment.

AR DEVICES:

Mobile devices (smartphones and tablets)

The most widely available option and the best fit for mobile AR apps, which range from pure gaming and entertainment to business analytics, sports, and social networking.

Special AR devices, designed solely for augmented reality experiences.

One example is the head-up display (HUD), which sends data to a transparent display directly in the user’s view. Originally introduced to train military fighter pilots, such devices now have applications in aviation, the automotive industry, manufacturing, sports, and more.

AR glasses (or smart glasses)

Google Glass, Meta 2 Glasses, Laster See-Thru, and Laforge AR eyewear are smart glasses capable of displaying notifications from your smartphone, assisting assembly-line workers, and providing hands-free access to content, among many other applications.

AR contact lenses (or smart lenses)

Taking augmented reality a step further, manufacturers like Samsung and Sony have announced the development of AR lenses. Samsung is working on lenses as an accessory to smartphones, while Sony is designing lenses as standalone AR devices (with features such as taking photos or storing data).

Virtual retinal displays (VRD)

Virtual retinal displays create images by projecting light beams directly into the human eye. Aiming at bright, high-contrast, high-resolution images, such systems have yet to be made practical.

EXISTING AR APPLICATIONS

Google Sky Map is a well-known AR app. It overlays information about constellations, planets, and more as you point the camera of your smartphone or tablet toward the heavens.

Wikitude is an app that looks up information about a landmark or object simply by pointing your smartphone’s camera at it.

The IKEA Place app overlays a new couch onto your space before you buy it, so you can make sure it fits.

Augmented reality home decorating apps like the Dulux Visualizer allow you to look through your smartphone camera and change the colors of your walls at will.

Inkhunter is a tattoo app that works similar magic on your skin, allowing you to check what a new design would look like on any part of your body before going anywhere near a needle!

There are fascinating business benefits, too. The Gatwick passenger app, for example, uses AR to help travelers navigate the insanity of a packed airport.

But AR is more than just smartphone fun. It’s a technology that finds uses in more serious matters, from warfare to medicine to education.

The U.S. Army, for example, uses AR tools to create digitally enhanced training missions for soldiers. It’s become such a prevalent concept that the army’s given one program an official name, Synthetic Training Environment, or STE. Wearable AR glasses and headsets may well help futuristic armies process data overload at incredible speeds, helping commanders make better battlefield decisions on the fly.

Doctors use AR both for teaching (medical students can use AR apps to explore organs in 3D, then project them onto their own bodies) and for previewing to patients how certain treatments might affect them. A company called Orca MD makes several “decide” apps (Eyedecide, Knee Decide, and Foot Decide, to name just three) that let people explore what parts of their body look like from the inside when they’re suffering from an ailment, and learn about a variety of medical conditions.

Classroom education’s another area where reality is well worth augmenting. Sun Seeker and Star Walk are just two of many AR applications giving students an inspiring new way to find their way around the sky. Satellites and passing flights can be tracked the same way; check out the app version of Flight Radar, for example.

The possibilities of AR tech are limitless. The only uncertainty is how smoothly, and quickly, developers will integrate these capabilities into devices that we’ll use on a daily basis.
