Augmented Reality: From Research to Ubiquity in 30 years

A short history of AR explaining why it has taken so long to get here

Ben Vaughan
Scape Technologies
9 min read · Jul 30, 2019


Tom Caudell’s original AR ‘wire harness assembly application’

In 1990, Tom Caudell coined the phrase “augmented reality” to describe a new approach to simplifying the way in which Boeing engineers built wiring harnesses.

Little did he know, almost 30 years later, the term “AR” would be the predominant phrase used to describe a new computing paradigm, with the power to transform the way that people work, play and interact with the world.

The very first tentative experiments in the early 1990s clearly demonstrated that the pioneers in the space recognized the potential of the technology and knew where augmented reality could add value in real-world situations. For example, Blair MacIntyre and Steve Feiner created the KARMA (Knowledge-based AR for Maintenance Assistance) project; Andrei State presented compelling medical applications demonstrating how surgeons could ‘see’ a fetus within a pregnant patient; and Jun Rekimoto invented the first hand-held AR display (NaviCam), which could detect colour-coded markers and display information on a see-through screen.

Columbia University — Printer Maintenance (KARMA project)

So, the real question is: why has it taken 30 years for this transformative technology to get out of research labs and into the hands of consumers?

1990 — Few portable computing devices, no see-through headsets or COTS Tracking

The principal requirements for an effective AR system can be summarized simply:

  • A computer system to generate images
  • A display device to view the images that are generated
  • A tracking and registration system to anchor the imagery in the correct place
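
To make these requirements concrete, here is a minimal, illustrative per-frame loop showing how the three pieces interact. Every function below is a hypothetical stand-in rather than a call into any real AR SDK; there is no actual camera, tracker or display behind it.

```python
import numpy as np

def capture_camera_frame() -> np.ndarray:
    """Stand-in for a camera driver: returns a dummy 640x480 RGB frame."""
    return np.zeros((480, 640, 3), dtype=np.uint8)

def estimate_pose(frame: np.ndarray) -> np.ndarray:
    """Tracking & registration: estimate the camera's pose in the world
    (here a 4x4 identity matrix) from markers, SLAM or a VPS query."""
    return np.eye(4)

def render_virtual_content(pose: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Image generation: draw virtual objects using the pose so that they
    appear anchored in the real scene (a no-op placeholder here)."""
    return frame

def present(frame: np.ndarray) -> None:
    """Display: push the composited frame to a screen or head-mounted display."""
    print("presented frame with mean pixel value", frame.mean())

# A real system runs this loop at the display's refresh rate.
for _ in range(3):
    frame = capture_camera_frame()                     # camera input
    pose = estimate_pose(frame)                        # tracking & registration
    composited = render_virtual_content(pose, frame)   # image generation
    present(composited)                                # display device
```

The quality of the pose estimate is what determines whether virtual content appears locked to the real world, which is why much of the history that follows is really a story about tracking.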

On top of these fundamental technologies, it is also desirable that there is a range of affordable tools available to enable developers to create applications quickly and efficiently. Finally, compelling applications are a prerequisite for ensuring adoption, either by Enterprises, who are looking at ROI, or by consumers, who are hoping for an enjoyable and/or enriching experience.

So, what was the world like 30 years ago and just how easy was it to create, share and enjoy an AR experience?

The short answer is that it was very difficult. There were no tablets, no smartphones and no see-through HMDs, with the exception of a handful of prototype optical and video units developed by large engineering companies. If you wanted a portable transmissive display, you had to buy and customize your own.

On the positive side, researchers back then were not restricted to desktop computers: the first Apple PowerBook 100 was released in 1991, IBM’s ThinkPad appeared the following year and the first PDAs (Personal Digital Assistants) arrived in 1993. It is worth remembering, though, that these early laptops had a tiny fraction of the compute power of today’s smartphones and that standardized software development tools did not exist. If you wanted to create an AR app, you would have to buy and customize your own hardware, build your own tracking system (probably adapted from the magnetic trackers widely used in VR applications at the time) and write the application code from scratch. Then, of course, there was no easy way to distribute the application and, equally, no way for any potential customers to use it without buying or building another copy of your hardware solution.

Not really surprising that the technology stayed in the Lab throughout the 1990s!

2000 — No SmartPhones/Glasses, No AR SDKs, No COTS Tracking & Registration

This state of affairs changed fundamentally in 2001 when ARToolKit was released as an open-source project on the University of Washington website. The SDK featured a 3D tracking library that used high-contrast (normally black and white) fiducial markers that could easily be created and printed.

ARToolKit fiducial marker
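
ARToolKit itself exposes a C API, but the core fiducial-marker loop is easy to sketch. The example below uses OpenCV’s ArUco module as a stand-in for the same high-contrast-marker approach; it assumes an opencv-contrib build where the classic cv2.aruco functions are available (newer OpenCV releases expose the same functionality through cv2.aruco.ArucoDetector), and the camera calibration values are placeholders.

```python
import cv2
import numpy as np

# Placeholder intrinsics; a real app would use values from camera calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

marker_side = 0.05  # marker edge length in metres
# 3D positions of the marker's corners in its own plane (z = 0),
# ordered to match the corner order returned by the detector.
object_points = np.array([[-marker_side / 2,  marker_side / 2, 0],
                          [ marker_side / 2,  marker_side / 2, 0],
                          [ marker_side / 2, -marker_side / 2, 0],
                          [-marker_side / 2, -marker_side / 2, 0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)            # default webcam, if one is attached
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        for marker_corners in corners:
            # Recover the 6-DoF camera-to-marker transform; this is the
            # 'registration' step that anchors virtual content to the marker.
            success, rvec, tvec = cv2.solvePnP(
                object_points, marker_corners[0], camera_matrix, dist_coeffs)
            if success:
                print("marker pose: rotation", rvec.ravel(),
                      "translation", tvec.ravel())
cap.release()
```

Conceptually this is the same pipeline the early ARToolKit applications used: find the black-and-white square in the camera image, compute the camera-to-marker transform, then render the virtual content with that transform.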

As the former CEO of ARToolworks, the world’s first commercial AR company and the owner of ARToolKit, the author might be accused of bias in claiming that the decision to operate with a dual-licensing model was one of the key factors in driving the development of AR. However, by maintaining the open-source version of the software, ARToolworks enabled the vast majority of early university research into AR and also made possible the later development of community-driven ports to Maemo, Symbian, Android, C#, Java and Flash, as well as integration with a host of game engines and other software tools.

Early ARToolKit ‘MagicBook’ app with customized Sony HMD

The early years of the 21st century also saw the rapid evolution of cell phones and mobile computing with camera-enabled devices allowing AR experiences to be deployed on low cost, handheld devices such as PDAs, albeit still constrained to pre-prepared, small indoor areas with marker-based tracking and registration.

Notwithstanding these constraints, the number of universities and companies researching AR expanded rapidly through the early 2000s, resulting in Georg Klein’s release of PTAM (Parallel Tracking and Mapping) in 2007, a new approach that allowed camera tracking and mapping in unprepared environments. The following year, Daniel Wagner developed natural feature tracking for mobile phones, which was to become the precursor of the Vuforia software platform. More important from a consumer point of view, however, was the work done by two Japanese developers, Saqoosha and Nyatla, who ported the open-source version of ARToolKit to C# and then to Flash.

2009 — FLARToolKit: consumers see AR in magazines via webcams

For the first time, anyone with a webcam-enabled PC could see augmented reality content delivered via websites or magazines without any requirement to pre-install any software. Suddenly, millions of people around the world were seeing AR for the first time. As a result, the number of available SDKs grew rapidly and the number of developers creating applications (mainly for marketing-related campaigns) ballooned.

2010 — Multiple SDKs, Smartphones, GoogleGlass, small area marker/NFT tracking

Finally then, some 20 years after the phrase was coined, augmented reality has become a broadly recognized technology that can be deployed across desktop, mobile and web platforms and can be used indoors in small areas and viewed through portable devices such as tablets and smartphones.

Coincidentally, 2010 also marks the start of THE CORPORATE ERA. Up until then, very few enterprises had shown any interest in the technology and the sector was dominated by university research departments, creative/marketing agencies, independent developers and the like, mainly working with limited budgets.

Almost overnight, it seemed that investment poured into the industry: Qualcomm acquired Imagination, Intel invested in Layar, Google launched Glass, Facebook acquired Oculus and Magic Leap announced its presence with $600 million of funding. Unsurprisingly, with this level of interest and investment, the technology and tools moved forward too, with multiple iterations of mobile phone-based SLAM implementations, the launch of Vuforia as a free-to-use development tool and the arrival of 3D object tracking.

2015 — Multiple SDKs, See-through HMDs, indoor Object tracking, Building Scale SLAM-based outdoor AR

Public recognition of AR was given a major boost by Snap’s introduction of filters in 2015 and accelerated massively the following year with Niantic’s launch of Pokemon Go (800m+ downloads). Enterprise involvement in the market showed no sign of slowing down either, with Apple acquiring Metaio, Microsoft launching the HoloLens and, in 2017, both Apple and Google opening up free markerless tracking libraries for iOS and Android (ARKit and ARCore).

Pokemon Go — the first mainstream consumer adoption of AR

The last couple of years have been somewhat topsy-turvy for the AR sector. On the one hand, there have been some high-profile failures (notably ODG, Meta and the cancellation of Intel’s $200m Superlight project), but on the other, there seems to have been a sea change in Enterprise adoption of AR, with regular reports of very large-scale deployments of wearables from companies such as RealWear, Microsoft, Vuzix and Magic Leap to aid industrial productivity.

Moreover, despite recent rumours that Apple has cancelled its SmartGlass development project, many other companies have announced that they are building AR glasses to complement the introduction of 5G networking. These include Huawei, Samsung, Lenovo, Vivo and, most interestingly, DigiLens, which, in partnership with Mitsubishi Chemical, has developed a new plastic material that can be used to create very low-cost, lightweight waveguides, ideally suited for deployment into the consumer market.

So, halfway through 2019, we live in a world where consumers are now very familiar with AR technology, Enterprises are moving from Proof-of-Concept to wide-scale deployments and consumer Smart Glasses look much closer to becoming a reality than they did a year ago. Over the last 20 years, AR has moved from marker-based tracking within small indoor environments, to markerless indoor tracking, to building-sized tracking outdoors and, today, to city-wide accurate markerless tracking using a Visual Positioning Service (VPS).

2019 onwards — The next era of spatial computing

Today, Scape Technologies, a London-based computer vision start-up, opens its SDK ‘ScapeKit’ to the public, after a private beta over the last few months.

To provide a robust, accurate and scalable Visual Positioning Service, Scape has developed a technology pipeline that takes raw camera data collected in large urban areas and creates a 3D map (point cloud) of each area. Anyone using the VPS can make a single query from a camera device to the cloud, which returns a highly accurate geo-position and orientation for that device.
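
This article does not document ScapeKit’s interface, so the sketch below is purely hypothetical: an illustration of the “one camera frame in, one geo-pose out” query pattern described above. The endpoint URL, payload shape and response fields are illustrative assumptions, not ScapeKit’s actual API.

```python
import base64
import json
import urllib.request

def query_vps(jpeg_bytes: bytes, coarse_lat: float, coarse_lon: float) -> dict:
    """Hypothetical single VPS query: send one camera frame plus a coarse GPS
    prior, receive a refined geo-position and orientation computed against
    the city-scale point-cloud map."""
    payload = json.dumps({
        "image_jpeg_base64": base64.b64encode(jpeg_bytes).decode("ascii"),
        "gps_prior": {"lat": coarse_lat, "lon": coarse_lon},
    }).encode("utf-8")

    request = urllib.request.Request(
        "https://vps.example.com/v1/localize",   # placeholder URL, not a real endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)

    # Illustrative response shape:
    # {"lat": ..., "lon": ..., "altitude_m": ...,
    #  "orientation_quaternion": {"w": ..., "x": ..., "y": ..., "z": ...}}
    return result
```

The important property is the single round trip: the device sends one frame and gets back a pose accurate enough to anchor AR content at city scale, which GPS alone cannot provide.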

What does this enable?

With the problem of inaccurate GPS positioning solved, many of the applications once imagined by AR developers are now a reality.

For example, today, the company can announce the world’s first MMOAR (Massively Multiplayer Online AR game), enabled by Scape’s Visual Positioning Service.

Holoscape, played in Trafalgar Square, London — developed by Scape intern, Damien Rompapas

It also enables new applications like accurate AR way-finding, visualizing urban points-of-interest, social AR, architecture pre-visualization and historical restoration, as seen in the video below.

By removing the last barrier to city-scale outdoor AR, the launch of Scape’s VPS means that developers can now create massively multiplayer AR experiences across whole cities, enabling a new class of application to drive consumer adoption of this transformative technology.

2019 — AR-enabled Smartphones, Smart Glasses, City-wide Outdoor localization and tracking

Scape’s VPS is the first step in the company’s plan to build the enabling infrastructure for a vast array of new spatial computing services, accelerated by the imminent arrival of widespread 5G networking and edge compute delivering massive bandwidth and extremely low latency.

If we look into the future, we can see a world where people have access to information via glasses, lenses or other mobile devices whenever and wherever they want it, and where autonomous vehicles, drones and robots move freely in the environment, understanding where they are, where they are going and what is around them. All of these devices are camera-enabled, and Scape’s long-term vision is to build the ultimate asset: the Machine-Readable World Map.

If you are keen to learn more about the work we are doing at Scape, or what you can do to partner with us, please visit our website or email us.

Additionally, if you are interested in learning more about our research projects, would like to collaborate, or would like to join the team, reach out to research@scape.io

Ben Vaughan is Head of External Affairs at Scape Technologies, a Computer Vision startup based in London, working on large-scale visual localization.

Follow the company on Twitter.

Interested in learning more about Scape Technologies?

We send a newsletter every couple of months, making sense of the AR industry and sharing our progress.

Sign up for our newsletter.
