Putting the Art into Artificial Intelligence
The intersection of art and data has been a fascinating area of creativity and innovation for decades. Generative art, a concept where part or all of an artwork is created by an autonomous system, can trace its roots back to the 1960s, while the Algorist movement of the 1990s produced a wave of computer-generated art powered by code. This link has endured as advanced analytics and machine learning have matured, and it’s common for today’s data practitioners, who frequently find themselves working alongside designers, to nurture a deep appreciation for compelling design.
Art has been an integral part of QuantumBlack since our early days, and beautiful artwork has always hung on the walls of our offices. Alongside that appreciation for art, the combination of bleeding-edge technology with human insight, or hybrid intelligence, has been a central principle from the very beginning. We recently collaborated with AI artist and researcher Sougwen Chung to explore the concept of hybrid intelligence, and Sougwen went on to produce a two-dimensional painting in collaboration with data-fuelled robotic arms.
Today this painting hangs in QuantumBlack’s London office to showcase the possibilities of hybrid intelligence. In celebration of both the artwork and the principle behind it, we wanted to share the story of how it was made. We’ll examine how a structure learning algorithm bridged gaps in decades’ worth of incomplete river data, how human expertise helped create a causal Bayesian network, and how this network guided robotic arms to create brush strokes indistinguishable from those of a human being.
Preparing the Project Palette
We began by asking which methodologies we could deploy to help fuel the gestures and brush strokes of Sougwen’s robots. It quickly became clear that the best option would be CausalNex, our open-source library. CausalNex leverages Bayesian networks, which aren’t inherently causal; when combined with the domain expertise of a human, however, they can highlight how different data points interact with and influence one another.
Of course, the methodology would require data. Alongside Sougwen, we decided to base the artwork on data from a river near the artist’s hometown. A river’s ecology holds a wealth of varied datasets that are interdependent on one another to some degree, and we were excited to use this natural feedback loop to fuel automated art.
Examining public monitoring information, we were able to collect a series of open datasets that tracked the river’s historic pH levels, the amount of detectable dissolved oxygen in the water, conductivity, riverflow, temperature and the area’s rainfall.
Together we agreed the best approach would be to build a Bayesian network model that could capture the relationships between these river measurements, fit conditional probability distributions and observe the effect of potential interventions. We wanted Sougwen to see how altering one variable would impact the others: if she changed the rainfall or pH levels, what would happen to the river’s flow and temperature? This would allow her to create and influence her own hypothetical river, and the causal direction of this river would in turn influence the direction of the brush strokes made by the robotic arms.
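To make the idea concrete, here is a minimal, hypothetical sketch of what such a model looks like under the hood. This is not the CausalNex API: just a two-node Bayesian network over discretised river variables (rain and flow) with invented probabilities, showing how shifting one distribution propagates to another.

```python
# A two-node Bayesian network, rain -> flow, over discretised states.
# All probabilities are illustrative, not real river data.

p_rain = {"dry": 0.7, "wet": 0.3}

# Conditional probability distribution: P(flow | rain)
p_flow_given_rain = {
    "dry": {"low": 0.8, "high": 0.2},
    "wet": {"low": 0.3, "high": 0.7},
}

def marginal_flow(p_rain, p_flow_given_rain):
    """Marginalise rain out: P(flow) = sum_r P(flow | r) * P(r)."""
    out = {}
    for r, pr in p_rain.items():
        for f, pf in p_flow_given_rain[r].items():
            out[f] = out.get(f, 0.0) + pr * pf
    return out

print(marginal_flow(p_rain, p_flow_given_rain))
# Shifting the rainfall distribution shifts the flow distribution too:
print(marginal_flow({"dry": 0.2, "wet": 0.8}, p_flow_given_rain))
```

Changing the marginal over rain and watching the flow distribution move is exactly the kind of what-if exploration described above, just at toy scale.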
Bridging the Data Gap
We soon encountered a problem. Despite dating back decades in some cases, the river monitoring datasets were wildly incomplete. To fill the gaps, we would need to train the model on the small portion of the data that was complete, so that it could learn the structural relationships between variables, before using that partially trained model to predict the missing historic data points.
We followed a fairly standard approach for learning the Bayesian network. First we applied a structure learning algorithm to the data to determine the shape of the graph, identifying the strongest relationships between variables based solely on the data itself. It’s worth noting that this is not an inherently causal model until domain expertise is added.
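As a rough illustration of what this first step does (CausalNex itself implements the NOTEARS structure learning algorithm, which is considerably more sophisticated), the sketch below scores every pair of synthetic variables by absolute Pearson correlation and keeps only the strongest candidate edges. The data and threshold are invented, and the surviving edges carry no direction yet; that is what domain expertise adds next.

```python
# A crude stand-in for structure learning: score variable pairs by
# absolute Pearson correlation and keep the strongest candidate edges.
# The data below is synthetic and purely illustrative.
from itertools import combinations
from math import sqrt

data = {
    "rainfall":  [0.0, 5.0, 12.0, 2.0, 9.0, 0.5],
    "riverflow": [1.1, 2.0, 3.5, 1.4, 2.9, 1.2],
    "ph":        [7.1, 7.0, 6.8, 7.1, 6.9, 7.2],
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def candidate_edges(data, threshold=0.8):
    """Keep only pairs whose dependence score clears the threshold."""
    edges = []
    for a, b in combinations(data, 2):
        score = abs(pearson(data[a], data[b]))
        if score >= threshold:
            edges.append((a, b, round(score, 2)))
    return edges

print(candidate_edges(data))
```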
In a traditional project we’d have access to an organisation’s experts and use CausalNex’s visualisation to show them how variables were changing, so they could provide industry insight. For example, an expert in loan underwriting could examine the interface, see changes in interest rates and application numbers, and explain the cause and effect between the two. Here, our team provided the domain expertise ourselves, researching the relationships between the datasets: what happens to pH levels in heavy rainfall, how dissolved oxygen relates to water temperature, and so on. This knowledge enabled us to adjust the data-learned graph in CausalNex until we reached a causal representation of how these variables interacted with each other.
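The human-in-the-loop adjustment can be pictured as simple edge surgery on the learned graph. The sketch below uses plain Python sets rather than CausalNex’s `StructureModel` (which exposes equivalent `add_edge` and `remove_edge` methods), and the starting edges and corrections are purely illustrative.

```python
# Hypothetical data-learned edges; directions may be wrong or missing
# until a person with domain knowledge reviews them.
learned = {
    ("riverflow", "rainfall"),   # backwards: rain drives flow
    ("rainfall", "ph"),
    ("ph", "temperature"),       # spurious for this illustration
}

def apply_domain_knowledge(edges, reverse=(), remove=(), add=()):
    """Return a new edge set with expert corrections applied."""
    edges = set(edges)
    for u, v in reverse:
        edges.discard((u, v))
        edges.add((v, u))
    for e in remove:
        edges.discard(e)
    for e in add:
        edges.add(e)
    return edges

causal = apply_domain_knowledge(
    learned,
    reverse=[("riverflow", "rainfall")],
    remove=[("ph", "temperature")],
    add=[("temperature", "dissolved_oxygen")],
)
print(sorted(causal))
```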
With these relationships understood, we moved to the final stage of training: fitting conditional probability distributions and observing the effect of potential interventions. CausalNex uses do-calculus to analyse the impact of an intervention, generating probabilistic formulas for its effect in terms of the observed probabilities.
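To show why an intervention differs from plain conditioning, here is a self-contained toy example (CausalNex wraps this machinery in its `InferenceEngine`). In a hypothetical network where a season variable confounds rainfall and riverflow, the backdoor adjustment P(flow | do(rain)) = Σ_s P(flow | rain, s) P(s) gives a different answer from the observational P(flow | rain). All numbers are invented.

```python
# Toy confounded network: season -> rain, season -> flow, rain -> flow.
p_season = {"winter": 0.5, "summer": 0.5}
p_wet_given_season = {"winter": 0.8, "summer": 0.2}  # P(rain=wet | season)

# P(flow=high | rain, season)
p_high_flow = {
    ("wet", "winter"): 0.9, ("wet", "summer"): 0.6,
    ("dry", "winter"): 0.4, ("dry", "summer"): 0.1,
}

def observe_high_flow(rain):
    """P(flow=high | rain): conditioning also shifts beliefs about season."""
    num = den = 0.0
    for s, ps in p_season.items():
        pr = p_wet_given_season[s] if rain == "wet" else 1 - p_wet_given_season[s]
        num += p_high_flow[(rain, s)] * pr * ps
        den += pr * ps
    return num / den

def do_high_flow(rain):
    """P(flow=high | do(rain)): backdoor adjustment over season."""
    return sum(p_high_flow[(rain, s)] * ps for s, ps in p_season.items())

print(observe_high_flow("wet"))  # observation: seeing rain suggests winter
print(do_high_flow("wet"))       # intervention: season stays at its prior
```

The two numbers differ because merely observing rain is evidence about the season, whereas forcing rain by intervention leaves the season distribution untouched.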
Once trained, we were able to use the model to fill in historic data using what it knew about causal relationships between variables. For instance, if we knew pH, temperature and dissolved oxygen levels for one day, the model could predict the statistical likelihood that it had rained the day before or the probability that riverflow would change in the following days.
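Imputation of this kind amounts to running the network backwards with Bayes’ rule. A minimal sketch with invented numbers: given an observed high riverflow, infer the probability that it had rained, via P(rain | flow) ∝ P(flow | rain) P(rain).

```python
# Invert a two-node network rain -> flow with Bayes' rule.
# All probabilities are illustrative.
p_rain = {"dry": 0.7, "wet": 0.3}
p_flow_given_rain = {
    "dry": {"low": 0.8, "high": 0.2},
    "wet": {"low": 0.3, "high": 0.7},
}

def posterior_rain(observed_flow):
    """P(rain | flow): joint over rain states, then normalise."""
    joint = {r: p_rain[r] * p_flow_given_rain[r][observed_flow] for r in p_rain}
    z = sum(joint.values())
    return {r: p / z for r, p in joint.items()}

print(posterior_rain("high"))
```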
Bayesian Brush Strokes
We gave the artist a pre-trained model: a hybrid-learned graph structure with all conditional probabilities fitted. This allowed her to conduct Bayesian network observation and intervention analysis, query the model herself and create her own counterfactual situations.
Sougwen developed the model further, adding her own electroencephalogram readings and using the causal direction of her hypothetical river to guide her two robotic units. Combining these outputs with her own AI system, she intertwined her own brush strokes with those of the robotic units to produce a beautiful, swirling artwork. With each layer of data she fed the units a different hue, from navy and sapphire to slate.
This was a project quite unlike any other, and the finished result embodies our passion for hybrid intelligence: it’s complex, beautiful, data-led and ultimately augmented by human creativity. Following eight years of growth, we’re moving into a new phase as QuantumBlack, AI by McKinsey, and while our capabilities and team will continue to grow, our guiding principles remain. We want to attract the world’s best technologists, and while we’re excited to see how analytics technology evolves in the coming years, the area we’re most interested in is what happens at the intersection of technology and creative human endeavour.