Create a Google Maps Live View-like application with Unity3D in under 10 minutes!

Daniel Fortes
7 min read · Jan 12, 2022


Location-based augmented-reality experiences with routes and navigation in Unity using the “Unity AR+GPS Location” plugin.

A few years ago Google started rolling out an experimental augmented-reality view for its Google Maps app. At first it was available in a few cities around the world, but now it is available everywhere and it is called Live View for Google Maps. Apple soon followed suit and introduced the 3D View for their Maps app, which provides a similar experience but only works in a few select cities.

The Unity AR+GPS Location plugin is an asset for the Unity3D engine which allows you to easily create location-based AR experiences in Unity. With it you can assign geographical coordinates to Game Objects inside the Unity editor, and, after you deploy the application to your device, go to the real-world location, fire up the app, and see your object being placed right on the spot.
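This kind of placement can also be done from code. Below is a minimal sketch that spawns a cube and pins it to a fixed coordinate; the Location, PlaceAtLocation and AltitudeMode names follow the plugin's documented API as I recall it, so double-check them against the version you have installed.

```csharp
// Minimal sketch: spawn a cube and pin it to a geographic coordinate.
// Type and member names (Location, PlaceAtLocation, AltitudeMode) are taken
// from the plugin's documentation; verify them against your installed version.
using UnityEngine;
using ARLocation;

public class PlaceCubeAtCoordinates : MonoBehaviour
{
    void Start()
    {
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);

        // Example coordinates (near the Chrysler Building in New York).
        var location = new Location
        {
            Latitude = 40.7516,
            Longitude = -73.9755,
            Altitude = 0,
            AltitudeMode = AltitudeMode.GroundRelative
        };

        // Attach the plugin's PlaceAtLocation component and hand it the location.
        var placeAt = cube.AddComponent<PlaceAtLocation>();
        placeAt.Location = location;
    }
}
```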

In the latest release (v3.6.0) of our Unity AR+GPS Location plugin we are excited to bring a new feature inspired by Google Maps Live View and powered by the Mapbox Directions API: the Routes and Navigation feature.

With the new Routes and Navigation feature you will be able to quickly create applications that help users navigate through cities, parks, event venues and other places using augmented reality.

Now I will show you, step by step, how to prototype such an application in a few minutes!

Let’s do it

First, open up a new project in the Unity3D editor (we will be using Unity 2020.3, but you can use Unity 2019.4 or newer). Now let's open up the Package Manager window and install all the packages we need for our application.

On the top-left of the window select Packages: Unity Registry and install the AR Foundation package. If you are building for Android, also install the ARCore XR Plugin, and if you are building for iOS, install the ARKit XR Plugin.

Again on the top-left of the Package Manager window, select Packages: My Assets in the drop-down. This will display all the assets you own on the Unity Asset Store. Here you should install the AR+GPS Location asset.
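As an aside, if you'd rather script the registry-package installs than click through the UI, the editor sketch below queues the same three packages through Unity's Package Manager scripting API. The AR+GPS Location asset itself still has to be imported from the Asset Store.

```csharp
// Editor-only sketch: install the AR packages by identifier, one at a time,
// using the Package Manager scripting API. Place this file in an Editor folder
// and run it from the menu item it adds.
using System.Collections.Generic;
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class InstallArPackages
{
    static readonly Queue<string> pending = new Queue<string>(new[]
    {
        "com.unity.xr.arfoundation", // AR Foundation
        "com.unity.xr.arcore",       // ARCore XR Plugin (Android)
        "com.unity.xr.arkit"         // ARKit XR Plugin (iOS)
    });

    static AddRequest current;

    [MenuItem("Tools/Install AR Packages")]
    public static void Install()
    {
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        // Wait for the current request to finish before sending the next one.
        if (current != null && !current.IsCompleted) return;

        if (current != null && current.Status == StatusCode.Failure)
            Debug.LogError(current.Error.message);

        if (pending.Count == 0)
        {
            EditorApplication.update -= Progress;
            return;
        }

        current = Client.Add(pending.Dequeue());
    }
}
```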

The next step is setting up the scene. Go ahead and create a new scene in your project: just right-click on the Project window and select Create->Scene.

Open your new scene, right-click on the Hierarchy window, and then select AR+GPS->Create Basic Scene Structure to automatically create all the elements of a basic AR+GPS Location scene.

Your scene should look like the one in the image below. You can delete the GPS Stage Object since we won't be using it for now.

Next, right-click again on the Hierarchy window and select AR+GPS->Mapbox Route.

This will create a new Game Object named Mapbox Route with all the necessary components and basic configuration for building an AR navigation experience. Still, there are a few things we need to set up so that everything works properly.

First, paste in your Mapbox Token. This is the Mapbox API token you can get from your account page on mapbox.com.

Next, go to the Route section of the Mapbox Route component's settings. This section is where you configure the start and end points of your route.

We want the route to start at the user's location when they open the app, so we will set the Type of the From property to User Location. For the destination, we will set the Type of the To property to Query. This will show another property called Query, a string that will be used to search for a location using the Mapbox Search API. Here you should write an exact address that will be the first result of the Mapbox search, e.g., "405 Lexington Ave, New York, NY 10174, United States". You can use the Mapbox playground to test Query strings before inputting them.
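If you'd rather sanity-check a query string from inside Unity than in the playground, the sketch below sends it to Mapbox's forward-geocoding endpoint and logs the raw response; the token and query values are placeholders you should replace with your own.

```csharp
// Quick check of a destination query against the Mapbox geocoding endpoint.
// Attach to any GameObject, fill in the fields, and watch the Console.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class QueryChecker : MonoBehaviour
{
    // Placeholders: use your own Mapbox token and the query you plan to ship.
    public string mapboxToken = "YOUR_MAPBOX_TOKEN";
    public string query = "405 Lexington Ave, New York, NY 10174, United States";

    IEnumerator Start()
    {
        string url = "https://api.mapbox.com/geocoding/v5/mapbox.places/"
                     + UnityWebRequest.EscapeURL(query)
                     + ".json?limit=1&access_token=" + mapboxToken;

        using (var request = UnityWebRequest.Get(url))
        {
            yield return request.SendWebRequest();

            // The first feature in the returned JSON is what a query like ours resolves to.
            Debug.Log(request.downloadHandler.text);
        }
    }
}
```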

Of course, the locations don't need to be hard-coded in your app. For instance, you can create a UI to let users search for and select locations themselves (see the sketch below), but that requires extra UI work and scripting, so for this simple example we will stick to hard-coded queries.
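If you do go that route, the wiring itself is simple: read a destination from an InputField and hand it to the route. In the sketch below, the plugin-specific step is left as a comment because the exact property or method name depends on the plugin version; the SetDestinationQuery call mentioned there is hypothetical.

```csharp
// Sketch of a user-driven destination: an InputField plus a button handler.
using UnityEngine;
using UnityEngine.UI;

public class DestinationInput : MonoBehaviour
{
    public InputField queryField; // UI field where the user types an address

    // Hook this up to a UI Button's OnClick event.
    public void OnSearchButtonClicked()
    {
        string query = queryField.text;

        // Hypothetical: hand the query to the Mapbox Route component and rebuild
        // the route, e.g. mapboxRoute.SetDestinationQuery(query). The real call
        // depends on the plugin's Routes and Navigation API, so check its docs.
        Debug.Log("Would rebuild the route to: " + query);
    }
}
```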

Now we are very near the finish line!

To facilitate in-editor testing, we will add the ARLocationDevCamera prefab to the scene; this allows us to visualize our results before building and deploying the application to our device. Go ahead and search for "ARLocationDevCamera" in the project's assets.

Then drag the ARLocationDevCamera prefab to be a child of the AR Session Origin Game Object.

We will also add a Mock Location, which is a geo-location we can use for in-editor testing. Create a new Location asset:

Then, in the Inspector panel, set the Latitude and Longitude to the testing location you want. This should be near the Query string's destination and around the location where you will want to run the app in the real world.

Finally, select the AR Location Root game object on the Hierarchy window, and scroll to the AR Location Provider component in the Inspector panel. Drag the location you just created to the Mock Location Data property.

Now we are ready to press the Play Button!

But wait, we get this message instead of our beautiful route! What is happening? Well, it turns out that we use TextMesh Pro for rendering text, and it needs to import some resources into our project before it works correctly. Press the Play Button again to exit Play Mode, and click on Import TMP Essentials.

Wait for the import to finish and press the Play Button again. Finally, you should see something like this!

You can look around using your mouse and move using the WASD keys on your keyboard.
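That movement comes from the dev camera prefab itself; for reference, a minimal equivalent of such a fly-through controller looks roughly like the sketch below (this is not the plugin's actual script, just an illustration of what it does).

```csharp
// Bare-bones fly camera: WASD to move, mouse to look around.
// Reference sketch only; the ARLocationDevCamera prefab already ships with
// its own controller, so you don't need to add this yourself.
using UnityEngine;

public class SimpleFlyCamera : MonoBehaviour
{
    public float moveSpeed = 5f;
    public float lookSpeed = 2f;

    float yaw;
    float pitch;

    void Update()
    {
        // Mouse look
        yaw += Input.GetAxis("Mouse X") * lookSpeed;
        pitch -= Input.GetAxis("Mouse Y") * lookSpeed;
        pitch = Mathf.Clamp(pitch, -89f, 89f);
        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);

        // WASD movement relative to where the camera is facing
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.Translate(input * moveSpeed * Time.deltaTime, Space.Self);
    }
}
```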

Now we are ready to build the app and deploy it to our device!

Don't forget to go into your Project Settings window and configure your XR Plug-in Provider. Since I am building for iOS, I will select ARKit. If you are building for Android, select ARCore.

Open up the Build Settings window, and add your scene to the build.

Go into the Player Settings and set your Company and Product names.

And under Other Settings, configure the platform-specific settings. For instance, on iOS you must set the Camera Usage Description and Location Usage Description strings.
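If you prefer setting those iOS strings from an editor script instead of the Player Settings UI, the sketch below uses Unity's PlayerSettings API; the description texts are only examples.

```csharp
// Editor-only sketch: set the iOS usage-description strings from code.
// They end up as NSCameraUsageDescription and NSLocationWhenInUseUsageDescription
// in the generated Xcode project's Info.plist. Place this file in an Editor folder.
using UnityEditor;

public static class IosUsageDescriptions
{
    [MenuItem("Tools/Set iOS Usage Descriptions")]
    public static void Set()
    {
        PlayerSettings.iOS.cameraUsageDescription =
            "The camera is used to display the augmented-reality view.";
        PlayerSettings.iOS.locationUsageDescription =
            "Your location is used to build and anchor the navigation route.";
    }
}
```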

Now go ahead, click on Build And Run and deploy your app to your device!

Now go and explore your own Metaverse route!

This is just a small sample of what you can do with the Routes and Navigation feature of the Unity AR+GPS Plugin. With the Metaverse wave just arriving, there are many possibilities for you to develop!

For more detailed information on this feature please go through our documentation. You can also contact us with any questions, issues or suggestions at our support page.

And, if you haven't already, get our asset at the Unity Asset Store.
