How to build an AR prototype in Unity
Just before this summer, our company Vragments and the Technologiestiftung Berlin, a non-profit foundation from Berlin, teamed up to explore the possibilities of augmenting real city models with geospatial visualizations. It just so happens that the Berlin Senate Department for City Development houses an exhibition of several Berlin city models at 1:500 and 1:1000 scale. You can take a virtual tour on their home page, if you like. Our efforts brought to life an AR prototype, built with Unity3D and its native ARKit and ARCore plugins, that you can download here for 👉 iOS and Android 👈. Currently it is in German, but we may consider adding an English version (if you are interested, please get in touch with me).
Code is also on GitHub, since this is an open source project. Feel free to check it out, play around with it and do your own cool stuff with geographic data.
Types of visualization
Three questions we had to ask ourselves before getting started:
- what open data was available at that time,
- what has been done before in “traditional” data visualization,
- and what would actually fit the bounding box of the real-reality (RR) vertical model.
We decided to investigate three main types of visualization on top of the model — areas, lines and points of interest. So we were looking for some interesting data-sets to use.
Getting data — area, lines and points of interest
The model’s bounding box is mostly inner city Berlin. What kind of data is available?
The most fine-grained set for areas would be data on a building-block level, as demonstrated in this visualization project of broadband internet access (German). Another option is to look for data-sets based on an administrative area pattern of so-called living-oriented spaces in Berlin’s open-data portal. The amount of data available was rather underwhelming, and we ended up using a measure of city density that relates the number of floors per building to the ground area.
What about lines? A recent project “Radmesser” published by Der Tagesspiegel and funded by MIZ Medieninnovationszentrum Babelsberg evaluated the availability and condition of bike lanes in Berlin. That should make for a good visualization.
A second set of line data came from a data project on urban bike mobility. The data-set contains the starting and ending timestamps of each bike trip, along with a set of coordinates going from one station to another, over the course of 24 hours. In total, 1505 bike trips had to be animated across the physical board, which already raises interesting performance questions (update cycles, frame rates, etc.) on a mobile device that is simultaneously tracking its environment.
Points of interest
A third visualization type was to focus on points of interest. The Berlin Wall had such an impact on the city and its development for decades that it was our main choice to be added as an augmentation onto the model. We picked three iconic sites as points of interest and visualized them through a set of pictures and a description.
Show, don't tell
Let's start with the area visualization. This is the source data map of building density. We can see there is enough variation in the data to make for good information visuals.
We put this to the test, created a mesh for the data-set and started designing some appropriate parameterized shaders. This part was mainly done by our 3D expert Jens Brandenburg. See some experiments in the following screenshots.
Our gradient shader, with bottom and top colors, extrudes based on the z-depth value of each administrative sector: the more densely populated a sector is, the higher and greener it extrudes in this example. The biggest problem was that this shader was world-locked, which is not easy to handle in an AR session: the scene’s coordinate root is locked to the device when the app starts and would have to be adjusted to match the mesh coordinates after initialization. This seemed too inflexible for a prototype.
Another weakness was the transparency. While it is useful to still see the physical model, at certain angles one could not really observe the extrusion and make sense of the data. For now, we found that a single color combined with extrusion works better overall to add depth to the model. The gradient variant could work nicely with an added legend and on a per-sector basis, with all other sectors either hidden or faded out considerably.
Our final result kept some properties of the gradient shader, such as the more opaque top surface.
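To illustrate the idea behind the parameterized shader: each sector's density value is mapped to an extrusion height and a color interpolated between the bottom and top colors. Here is a Python sketch of that mapping (all numbers below are made up for illustration, not the shader's actual parameters):

```python
def density_to_extrusion(density, max_density, max_height=0.05,
                         bottom=(0.8, 0.2, 0.2), top=(0.2, 0.9, 0.3)):
    """Map a sector's density to an extrusion height and a gradient color."""
    t = max(0.0, min(1.0, density / max_density))  # normalize and clamp to [0, 1]
    height = t * max_height                        # denser sector -> higher extrusion
    color = tuple(b + (tp - b) * t for b, tp in zip(bottom, top))  # lerp bottom -> top
    return height, color

# a fairly dense sector extrudes high and takes a greener color
print(density_to_extrusion(8.0, 10.0))
```

The shader does this per vertex on the GPU, but the mapping itself is just this clamp-and-lerp.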
Ride the streets of Berlin
Moving on to our line visualization. We decided to use two data-sets that were equally interesting.
Our first data-set was a filtered GeoJSON file, derived from the original data of the project “Radmesser”, that only contains data from our target bounding box. Here’s an excerpt of a feature from that feature collection, containing a multi-line geometry property. We were most interested in the key value (“STRSCHL”: “1605”), the coverage (“ABDECKUNG”: 0.0) and, of course, the coordinates.
[ ... OMITTED FOR READABILITY ... ]
[ ... OMITTED FOR READABILITY ... ]
[ 13.3192, 52.508958662336752 ],
[ 13.319448764861379, 52.508962989456393 ],
[ 13.319554453283571, 52.508952001315976 ],
[ 13.319803551156319, 52.508937697178176 ],
[ 13.320032190971109, 52.508923102030579 ],
[ 13.320316359267492, 52.508909294780977 ],
[ 13.322634081280583, 52.508785167882053 ],
[ 13.32290629904543, 52.508766556885412 ],
[ 13.32500968897619, 52.508669822693989 ],
[ 13.325171474800982, 52.508686845616928 ]
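Before anything can be drawn, this GeoJSON has to be parsed into something usable. Here is a minimal Python sketch of that extraction step (the project's actual loader is written in C#; the sample document below is trimmed down to the shape of the excerpt above, with illustrative values):

```python
import json

# A trimmed sample in the shape of the filtered "Radmesser" GeoJSON
sample = """{
  "type": "FeatureCollection",
  "features": [{
    "type": "Feature",
    "properties": {"STRSCHL": "1605", "ABDECKUNG": 0.0},
    "geometry": {
      "type": "MultiLineString",
      "coordinates": [[[13.3192, 52.50896], [13.31945, 52.50896]]]
    }
  }]
}"""

def extract_streets(geojson_text):
    """Pull street key, coverage and coordinate lists out of each feature."""
    collection = json.loads(geojson_text)
    streets = []
    for feature in collection["features"]:
        props = feature["properties"]
        streets.append({
            "key": props["STRSCHL"],
            "coverage": props["ABDECKUNG"],
            # one list of (lon, lat) pairs per line in the multi-line geometry
            "lines": feature["geometry"]["coordinates"],
        })
    return streets

print(extract_streets(sample))
```

Note that GeoJSON stores coordinates as longitude first, latitude second, which matters for the projection step below.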
So how do we get this to map into a Unity3D scene? And if we do, how can we make sure it matches the 1:1000 scale of the physical model? With some simple vector math! Admittedly, we are working with an idealized model of a perfectly round world, which in reality it is not. Our planet is more or less a giant rocky potato 🥔 in space. Our approach is relatively simple: we take the latitude and longitude values to build a Quaternion and project a visual element at a distance of 6371 units (roughly the radius of our 1:1000-scaled model Earth) from the point of origin. The code snippet for that is:
Vector3 coord = Quaternion.AngleAxis(coords[i].longitude, -Vector3.up) * Quaternion.AngleAxis(-coords[i].latitude, -Vector3.right) * new Vector3(0, 0, -6371.0f);
To make it more visual, take a look at the image below. Unity uses a left-handed coordinate system. Vector3.up (the Y axis) is the green arrow in the image. Adding to the angle around it turns clockwise around its axis. Since Berlin is in the Eastern Hemisphere, we need to subtract the longitude value to get a counterclockwise rotation. For latitude, we rotate around the X axis (Vector3.right, the red arrow in the image), where we want a clockwise rotation. Finally, we move 6371 units along the resulting vector direction and draw a point, a line, or whatever object you’d like.
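As a sanity check, the same projection can be written down outside of Unity as a plain spherical-to-Cartesian conversion. Here is a Python sketch of the math (the project itself does this in C# with quaternions, as shown above); the signs follow the left-handed, Y-up convention just described, and the Brandenburg Gate coordinate is only an illustrative input:

```python
import math

EARTH_RADIUS = 6371.0  # units of our 1:1000 model Earth, as in the snippet above

def latlon_to_unity(lat_deg, lon_deg, radius=EARTH_RADIUS):
    """Project a (latitude, longitude) pair onto a sphere of the given radius,
    matching what the quaternion chain does in Unity's left-handed, Y-up space."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = radius * math.cos(lat) * math.sin(lon)   # east
    y = radius * math.sin(lat)                   # up, towards the north pole
    z = -radius * math.cos(lat) * math.cos(lon)  # (0, 0, -radius) at lat = lon = 0
    return (x, y, z)

# Brandenburg Gate, roughly (illustrative input)
print(latlon_to_unity(52.5163, 13.3777))
```

Every point ends up exactly radius units from the origin, which is the invariant the Unity code relies on.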
Once we have done this for each feature in the collection, for each multi-line object within a feature, and for each point of a multi-line, we get:
So, how do we get it to appear somewhere close to where we need it — ideally in front of our AR camera, or near an object that we can project our data onto? We simply rotate the world origin by an offset that matches the coordinates of inner-city Berlin. I used the coordinates of the Brandenburg Gate for that, since it is an iconic place that can easily be spotted. Additionally, I introduced an anchor element to further offset and tilt the whole visualization, which is especially useful for our marker-based use case in the exhibition.
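The offsetting step can be sketched the same way: project the data point onto the sphere, then undo the anchor coordinate's rotation so the anchor lands at the local origin. This is a hedged Python sketch of the idea, not the project's C# code; the anchor values used below are the Brandenburg Gate coordinates mentioned above:

```python
import math

def world_to_anchor_local(lat_deg, lon_deg, anchor_lat, anchor_lon, radius=6371.0):
    """Project a coordinate onto the sphere, then apply the inverse of the
    anchor's rotation so the anchor sits at the origin (x ~ east, y ~ north)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    alat, alon = math.radians(anchor_lat), math.radians(anchor_lon)
    # position on the 1:1000 model Earth (same formula as the projection above)
    x = radius * math.cos(lat) * math.sin(lon)
    y = radius * math.sin(lat)
    z = -radius * math.cos(lat) * math.cos(lon)
    # inverse rotation about Y by the anchor longitude
    x, z = x * math.cos(alon) + z * math.sin(alon), -x * math.sin(alon) + z * math.cos(alon)
    # inverse rotation about X by the anchor latitude
    y, z = y * math.cos(alat) + z * math.sin(alat), -y * math.sin(alat) + z * math.cos(alat)
    # translate so the anchor sits exactly at the local origin
    return (x, y, z + radius)

# Alexanderplatz relative to the Brandenburg Gate (illustrative inputs)
print(world_to_anchor_local(52.5219, 13.4132, 52.5163, 13.3777))
```

In Unity the same effect is achieved by rotating the scene root by the inverse anchor rotation instead of transforming every point.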
To add a little more realism to our testing, we added a digital remake of the wall model (credits again to Jens), also created from openly available 3D data. I will spare you the details of cleaning it up, reducing the vertex count and so on; it took quite some time to turn it into something that would not immediately freeze a regular smartphone. What is a regular smartphone for AR anyway? Our older Samsung S7s could run it quite smoothly, so we were happy with that.
Our second line-based data-set came from the data project on urban bike mobility. You can see all bike stations and trips during June 3rd, 2015 in the picture below. So here we already combined lines with points of interest.
The challenge now was to load each route and animate it along a given line. Here's how we did it (details can be found in BikeRoutesRenderer.cs):
- The data type we store all routes in is a sorted dictionary that maps each starting time to a list of objects (all bike tours that started at that particular timestamp)
// All routes: key = start time, value = list of bike GameObjects
private SortedDictionary<DateTime, List<GameObject>> routesDict;
- Load the JSON and create the routes for each feature object
- Add all nodes of the route to a list of vectors
- If the timestamp exists, add this route to the list of our dictionary entry. If not, create a new one
// add current route to sorted dictionary (create the list if the start time is new)
List<GameObject> goList;
if (!routesDict.TryGetValue(fromTime, out goList))
    routesDict.Add(fromTime, goList = new List<GameObject>());
goList.Add(routeGO); // routeGO: the route's GameObject built in the previous step
- Once everything is loaded, we can start the animation cycle with a speedup factor (so we don't spend an actual day)
- The game object loaded contains a Bike component that does the actual animation along the route
// currentTime starts at 2015-06-03 00:00:00
foreach (KeyValuePair<DateTime, List<GameObject>> route in routesDict)
{
    // advance the simulated clock until this route's start time is reached
    while (currentTime < route.Key)
    {
        currentTime = currentTime.AddSeconds(Time.deltaTime * SPEEDUP_FACTOR);
        time.text = currentTime.Hour.ToString("00") + ":" + currentTime.Minute.ToString("00");
        yield return null; // interrupt the coroutine until the next frame
    }
    // release all bikes that start at this timestamp
    foreach (GameObject go in route.Value)
    {
        Bike compBike = go.GetComponent<Bike>();
        // the Bike component animates the object along its route
        count.text = currentCount++.ToString();
    }
}
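Stripped of the Unity specifics, the scheduling logic above boils down to grouping trips by start time and advancing a sped-up clock frame by frame. Here is a Python sketch of that idea (the frame time and speedup values below are made up; the real code uses Time.deltaTime and a coroutine):

```python
from collections import defaultdict
from datetime import datetime, timedelta

SPEEDUP_FACTOR = 600  # illustrative: one half-second frame covers 5 simulated minutes

def group_by_start(trips):
    """Group trip ids by their start timestamp, sorted by time
    (a stand-in for the C# SortedDictionary above)."""
    routes = defaultdict(list)
    for trip_id, start in trips:
        routes[start].append(trip_id)
    return dict(sorted(routes.items()))

def run_day(routes, frame_dt=0.5, start=datetime(2015, 6, 3)):
    """Advance a simulated clock frame by frame and record when each
    group of bikes is released (the coroutine's yield is elided here)."""
    current, released = start, []
    for start_time, trip_ids in routes.items():
        # spin the clock forward until this group's departure time
        while current < start_time:
            current += timedelta(seconds=frame_dt * SPEEDUP_FACTOR)
        released.append((current.strftime("%H:%M"), trip_ids))
    return released
```

Because the dictionary is sorted, each frame only ever has to look at the next pending start time, which keeps the per-frame cost low even with 1505 trips.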
Tear down this Wall
The third visualization type focuses on points of interest along the Berlin Wall. As sites, we picked:
- the East Side Gallery
- Checkpoint Charlie
- the Berlin Wall Memorial site (Bernauer Street)
We used CC-licensed images from Wikipedia to show users specific information when they come close to these POIs. We also added a nice little description to each image and POI. This is how it looks on the real physical model.
What did we learn?
We believe this is an innovative project: it allowed us to test how augmenting physical 3D models with data visualizations can help future city development. It can be a valuable tool for planners and citizens alike, making sure that decisions are made and communicated transparently.
Three examples where we see this in the future:
- How road planning affects communities (see our VR project A100, about a Berlin highway extension project)
- Allowing citizens to make informed decisions about which community/area they want to live in based on area specific open data
- Displaying real-time sensor data to discover critical issues regarding environment and health (air/noise pollution, water quality, etc.)