Visualising Uncertainty using Blender and Power BI

by Mike Simpson, PhD - Research Software Engineer, Newcastle University

--

Visualisations are an important tool for extracting value from data. They can also help us to tell ‘data stories’ in a way that is accessible and easy to understand.

However, many existing visualisations don’t tell the whole story; there is often additional information in a dataset that adds context or meaning to the results. But how do we include this data in the visualisation in a way that the average viewer can understand?

I have been working with Professor Nick Holliman and a number of other colleagues to explore some new ways to do this, focusing on the visualisation of uncertainty. The work has a range of potential applications and has attracted a number of interested partners, including the Alan Turing Institute, who funded the project, as well as the UK Department for Transport, local government, energy companies and even the military.

The main focus of the project has been developing the visualisations themselves, but we also wanted to make sure that the tools were as user-friendly as possible. Once the visualisations had been created and tested, we worked to make them easier to use by integrating them with other applications, specifically Microsoft Power BI.

Visualising Uncertainty

We had already been collaborating with the Urban Observatory at Newcastle University to create tools to visualise urban sensor data. This data includes measurements such as rainfall and air quality, collected from a number of sensors located in and around the city of Newcastle.

In order to show the results in context, we displayed the data using a series of icons (or ‘glyphs’), overlaid on a 3D model of the city, as shown below.

Here, the coloured section of each glyph (the green circle) represents the average value reported by that sensor during the specified time period. The black and white borders around each shape are designed to help differentiate the glyph from the background and to make the colour easier to read.

However, we wanted to find a way to add additional data to the visualisation, and we decided to do this by changing the way we used the white shape within the glyph.

The image below is an example of a visualisation that was produced by our tool. The visual complexity of the white shape now represents the uncertainty value. The higher the uncertainty, the more visually complex the shape becomes (i.e., the larger the number of ‘petals’ on the white ‘flower’ shape).

In this case, we were using the variance in the results from each sensor, and a high ‘uncertainty’ value may indicate a problem with that sensor or some sort of local disturbance.
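As a rough illustration of the encoding, here is a minimal Python sketch of how an uncertainty value might be mapped onto a petal count and turned into a flower outline. The names, thresholds and formula here are illustrative assumptions, not the project's actual code.

```python
import math

# Hypothetical bounds on visual complexity (illustrative, not from the project).
MIN_PETALS = 3   # lowest complexity: low uncertainty
MAX_PETALS = 12  # highest complexity: high uncertainty

def petal_count(uncertainty, u_min, u_max):
    """Map an uncertainty value (e.g. sensor variance) onto a petal count."""
    # Normalise into [0, 1], guarding against a degenerate range.
    t = 0.0 if u_max <= u_min else (uncertainty - u_min) / (u_max - u_min)
    t = min(max(t, 0.0), 1.0)
    return MIN_PETALS + round(t * (MAX_PETALS - MIN_PETALS))

def flower_outline(petals, radius=1.0, depth=0.3, steps=256):
    """Generate 2D points for a 'flower' outline whose radius
    oscillates once per petal around the circle."""
    points = []
    for i in range(steps):
        angle = 2 * math.pi * i / steps
        r = radius * (1.0 - depth * (0.5 + 0.5 * math.cos(petals * angle)))
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

A glyph with low variance would then be drawn with a nearly circular white shape, while a noisy sensor would get a many-petalled one that is immediately visually distinct.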

This is information that would not be available to the user if only the average results were shown, as in the first image.

The Problem

The visualisations were developed using Python and Blender, an open-source 3D creation suite. To produce these visualisations, Blender, the Python code and a number of other dependencies would all have to be installed on the user's machine. The user would then need to manually run the code that loads the data into the visualisation tool, and then retrieve the resulting image(s). They would also need either a dedicated graphics card (GPU) or a powerful processor (CPU); otherwise, rendering the images could take several minutes, depending on the complexity of the models and the size of the dataset.
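To give a sense of what that manual workflow involves, here is a minimal sketch of how a render might be launched using Blender's headless mode. The flags (--background, --python) are standard Blender CLI options; the script name and arguments are illustrative only.

```python
import subprocess

# Hypothetical invocation of the rendering pipeline.
subprocess.run(
    [
        "blender",
        "--background",                   # run without opening the Blender UI
        "--python", "render_glyphs.py",   # script that builds and renders the scene
        "--",                             # everything after this is passed to the script
        "--data", "sensor_readings.csv",
        "--out", "render.png",
    ],
    check=True,
)
```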

This is not practical for most users, so we wanted to create a more user-friendly approach that did not require the installation of any dependencies or any programming.

The Solution

We decided to investigate whether we could use another tool as a ‘front-end’, in order to make the experience more user-friendly.

We chose to do this using Power BI, Microsoft's business intelligence tool, which we have used in a number of our other projects. Power BI can produce a range of interactive data visualisations using a simple and familiar drag-and-drop interface.

Step 1: Integration with Power BI (Python)

We created an API that could be given a dataset and then call Blender to render the visualisation and return the resulting image. This was successfully developed and tested using the ‘Python Visual’ option within Power BI.
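For illustration, a Python Visual script along these lines might look like the sketch below. Power BI exposes the fields the user selects to the script as a pandas DataFrame named dataset, and displays whatever the script plots; the API endpoint and payload shape here are hypothetical.

```python
# Inside a Power BI 'Python Visual', the selected fields arrive as a
# pandas DataFrame called 'dataset'. The endpoint below is illustrative.
from io import BytesIO

import requests
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

# Send the data to the (local, at this stage) rendering API.
resp = requests.post(
    "http://localhost:5000/render",  # hypothetical local API
    json={"readings": dataset.to_dict(orient="records")},
)
resp.raise_for_status()

# Display the rendered image as the visual's output.
img = mpimg.imread(BytesIO(resp.content), format="png")
plt.imshow(img)
plt.axis("off")
plt.show()
```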

This integration made interacting with the visualisation easier, as the user could use Power BI to specify which datasets and columns to use, as well as filter the data to produce different results.

However, several setup steps were still required to use this version of the tool, many of which required programming experience. In addition, the user still had to have Blender and the dependencies installed, and the rendering was still being done on their local machine.

So, while this was a definite improvement in usability, it didn’t solve all of our problems.

Step 2: Moving to the Cloud (TypeScript)

Next, we attempted to reimplement the visual in such a way that Blender didn't need to run on the user's machine at all.

First, rather than using the Python Visual feature within Power BI, we decided to implement our tool as a full Custom Visual using TypeScript. This had the added bonus of giving us more customisation options within Power BI, as well as access to additional information, such as how big the visual was on the screen (which meant we weren't wasting resources rendering images that were larger than they needed to be). It also made it easier for the user to import the visual into Power BI without any programming or advanced setup steps.

The next step was to install Blender, the API and all of the dependencies on a virtual machine on Microsoft's Azure cloud platform. This enabled us to do all of the rendering in the cloud, rather than on the user's machine.

The diagram below shows how the various elements communicate with each other.

The Visual sends the data from Power BI to the virtual machine, which runs the necessary Python code and then triggers Blender to render the image. That image is then downloaded and displayed in the visual in Power BI.
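As a sketch of what the VM side might look like, the snippet below uses Flask (an illustrative choice; the real endpoint names, paths and error handling may differ) to accept the data, trigger a headless Blender render and return the image.

```python
# Sketch of the VM-side rendering API. Names and paths are hypothetical.
import json
import os
import subprocess
import tempfile

from flask import Flask, request, send_file

app = Flask(__name__)

@app.route("/render", methods=["POST"])
def render():
    workdir = tempfile.mkdtemp()
    data_path = os.path.join(workdir, "data.json")
    image_path = os.path.join(workdir, "render.png")

    # Persist the incoming dataset where the Blender script can read it.
    with open(data_path, "w") as f:
        json.dump(request.get_json(), f)

    # Trigger a headless Blender render (same pattern as the local workflow).
    subprocess.run(
        ["blender", "--background", "--python", "render_glyphs.py",
         "--", "--data", data_path, "--out", image_path],
        check=True,
    )
    return send_file(image_path, mimetype="image/png")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```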

In order to provide a secure connection (only HTTPS requests are allowed from within the Power BI environment), an Azure Function was used as a proxy between the visual and the virtual machine.
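A minimal Python sketch of such a proxy function is shown below. The VM address is hypothetical (in practice it would come from app settings), and a real deployment would also handle authentication and errors.

```python
# Sketch of the Azure Function proxy: it forwards the POST body to the
# rendering VM and relays the image back to the visual over HTTPS.
import requests
import azure.functions as func

VM_RENDER_URL = "http://10.0.0.4:5000/render"  # private VM endpoint (illustrative)

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Forward the dataset to the VM and wait for the rendered image.
    resp = requests.post(VM_RENDER_URL, json=req.get_json())
    return func.HttpResponse(
        body=resp.content,
        status_code=resp.status_code,
        mimetype="image/png",
    )
```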

The Result

The end result is a visualisation pipeline that is much more user-friendly than our initial tools. With this implementation, the user only needs to install Power BI and import the visual, with no additional setup, as all of the dependencies live in the cloud. They can then use Power BI to load the data and configure the visualisations, without ever needing to look at the application code.

While the cloud rendering process can take longer than rendering locally (around 10 seconds via the cloud, compared to 8 seconds on my GPU-enabled desktop), the difference is negligible. It also means that the image will always render in around 10 seconds, regardless of whether the user is using a computer, a tablet or even a phone (assuming they have a stable internet connection).

More

You can read more about the glyphs in this open-access paper: https://library.imaging.org/ei/articles/36/11/HVEI-206.

You can also watch a video about a project where these glyphs have been used: https://vimeo.com/901318157?share=copy

Conclusion

Sometimes, our role as RSEs is not just to provide solutions, but to find ways to make those solutions as easy as possible for our clients and the researchers we collaborate with to use. In this project, we successfully used Power BI to take tools that were developed for use in a lab and make them more accessible to general users.

Thanks for reading!

For more blogs from the Newcastle RSE Team, click here.

--

Mike Simpson
Newcastle University Research Software Engineering

Perpetual student, photographer, gamer, aspiring writer, sci-fi addict and code monkey.