“Samantha”

From here to “her”

Craig Walmsley
Mar 18 · 9 min read

How AI will transform digital design

In the 2013 Spike Jonze film “her”, Joaquin Phoenix plays a lonely writer who falls in love with his operating system, “Samantha”, voiced by Scarlett Johansson.

This looks kind of sad, but it was a really good movie, I promise.

It was a great film, and like all good Science Fiction, was just plausible enough to be believable, and just distant enough to seem magical.

Well, a lot has happened since the movie came out, not least, the arrival of voice technologies like Siri and Alexa.

At the moment, these devices deliver very simple experiences, taking simple instructions in pre-defined processes that are designed to cater to everyone in the same way.

But to deliver the kinds of experiences outlined in “her”, they’ll have to become truly “conversational”, adapting to the needs and manner of the people they’re speaking with.

But how can you create an experience that adapts itself to the person using it?

A combination of automated A/B testing, advanced customer segmentation, and AI-driven design all point toward a future where you can personalise not just content, or even the order of its presentation, but the very nature of the interaction itself.

This might not make you fall in love with your computer, but it should give you a computer that talks specifically to you.

Data-Driven Content

Netflix already tailors each member’s home screen to their interests, while simultaneously exposing the depth and breadth of their content catalogue.

All of the movies recommended to you when you visit are driven by your past viewing habits, as interpreted, scored and measured by multi-variable algorithms.

A single show can present multiple themes, and different aspects of the show might appeal to different people. Presenting the same show with a single visual representation obscures this complexity and narrows its appeal, so Netflix presents multiple representations of the same show to highlight different aspects of it.

Using AI, Netflix presents different promos for the same show to different people, delivering a measurable improvement in engagement with their shows. What content is presented to you, and how it is presented: all of it is driven by machine learning.

Netflix artwork for ‘Stranger Things’, covering a breadth of content themes, delivered to different users according to their personalisation algorithm.

This machine learning is powered by one simple mechanism — A/B testing — you put different options in front of customers, measure which is most successful, and go with that. Netflix is an exceptionally sophisticated practitioner of the art, but they are just one high profile example of this sort of data and machine learning driven personalisation.
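To make that mechanism concrete, here’s a minimal sketch in Python; the variant names and conversion numbers are invented purely for illustration:

```python
# A minimal A/B test: show two variants, measure conversions, keep the winner.
# The variants and numbers here are invented purely for illustration.

def conversion_rate(conversions: int, impressions: int) -> float:
    """Fraction of impressions that converted."""
    return conversions / impressions

# Hypothetical results after serving each variant to a slice of traffic.
results = {
    "variant_a_orange": {"impressions": 10_000, "conversions": 420},
    "variant_b_purple": {"impressions": 10_000, "conversions": 365},
}

rates = {
    name: conversion_rate(r["conversions"], r["impressions"])
    for name, r in results.items()
}

# "Go with that": the variant with the higher conversion rate wins.
winner = max(rates, key=rates.get)
print(f"Winner: {winner} at {rates[winner]:.2%}")
```

In a real system you’d also check that the difference is statistically significant before committing to the winner, but the core logic really is this simple.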

Google is machine-optimising search ad creative. Facebook has created Dynamic Creative Ads, which test different combinations of ad elements against different variables to present the optimal combination to any given individual.

Which has one significant consequence.

People might select what content is present in the catalogue, and they might even make an initial decision about how this content could be represented by different visuals. But what content is displayed to any given user, and how it is displayed, is up to the machine to determine; people play no further role.

Designing Digital

That has implications for not just individual creative elements, but for design more generally.

A/B testing is widely applied to interface design.

The placement of a button, its size, its colour; the layout of the navigation; the order of content on a page; the width of the page measure; the sequence of questions; the steps in a checkout process — all of these are routine interface design decisions which, in many organisations, are no longer solely made by humans, but by the data derived from A/B tests.

This is your typical A/B Test — try a couple of variants, select the one that works best for most people.

People spot areas of possible improvement, create new options, test them, and select those which work most effectively, changing the design over time in an interaction between human judgement and machine-driven testing.

In many results-focused organisations, A/B testing effectively makes design decisions.

Data-Driven Design

Other forms of data-driven design validation are emerging.

For example, the company Eyequant uses AI to assess and validate design.

It took the responses of 500 people to train its algorithm, which it can then use to score design work: analysing designs and driving design decisions based on an assessment of the most effective design choices.
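As a rough sketch of how such a scorer might work (this is not Eyequant’s actual model; the features, ratings and data below are all invented), you could train a simple regressor on human ratings and use it to score new designs:

```python
# A toy "design scorer": train a model on human ratings of designs,
# then use it to score new work. Not Eyequant's actual model; the
# features and data below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Each design reduced to a few hypothetical features:
# [contrast, whitespace_ratio, element_count, avg_font_size]
design_features = np.array([
    [0.8, 0.4, 12, 16],
    [0.5, 0.2, 30, 12],
    [0.9, 0.5, 8, 18],
    # ... in practice, hundreds of human-rated designs
])

# Average "clarity" rating each design received from human reviewers.
human_scores = np.array([7.9, 4.2, 8.6])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(design_features, human_scores)

# Score a new, unrated design the way a human panel might have.
new_design = np.array([[0.7, 0.35, 15, 14]])
print(f"Predicted score: {model.predict(new_design)[0]:.1f}")
```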

User Data + Machine Learning = AI-Driven Design Assessment

And once you have machines that can determine what good design looks like, you can create machines that generate this design for you — with companies like WiX, Firedrop, and Adobe all creating services that have AI design a site’s interface without the need for human intervention.

By analysing a wide range of possible designs, machines can now construct possible design solutions for any given experience.
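A minimal sketch of that generate-and-score idea, assuming a learned scorer like the toy one above (all of the design attributes here are invented):

```python
# Generate-and-score: sample candidate designs at random, rank them with a
# design scorer, and keep the best. Illustrative only.
import random

def score(design: dict) -> float:
    """Stand-in for a learned design scorer like the one sketched above."""
    # e.g. reward high contrast and a moderate number of elements
    return design["contrast"] * 10 - abs(design["elements"] - 12) * 0.3

candidates = [
    {
        "contrast": random.uniform(0.3, 1.0),
        "elements": random.randint(5, 40),
        "font_size": random.choice([12, 14, 16, 18]),
    }
    for _ in range(1000)
]

best = max(candidates, key=score)
print(f"Best generated design: {best}, score {score(best):.2f}")
```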

(This possibility isn’t confined to digital interface development — MIT researchers are already developing AI industrial design.)

Machine Design + Automated A/B Testing = Machine Design Evolution

This opens up the possibility that large parts of an on-going design process could be driven by machine, without the need for people (a minimal code sketch follows the list below):

  • Take an initial design, and create a series of measures for that process
  • Allow AI to iterate different options based on that initial design
  • A/B test the different options
  • Let the machine pick the one that most effectively meets the measures set
  • Automatically release the new option to optimise the experience
Here the AI creates interface variations and then A/B tests them, to determine what works best.
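Here’s what that loop might look like in code: a minimal sketch, assuming a single evolving trait (button colour) and a simulated A/B measurement standing in for live traffic:

```python
# A minimal sketch of the loop described above, with button colour as the
# single evolving trait. The "audience" here is simulated; in production
# the measurement would come from a live A/B test.
import random

def measure_engagement(colour: str) -> float:
    """Stand-in for a live A/B test: returns an observed conversion rate."""
    base = {"orange": 0.042, "purple": 0.037, "green": 0.035}
    return base.get(colour, 0.03) + random.gauss(0, 0.002)  # sampling noise

current = "green"  # the initial, human-made design
palette = ["orange", "purple", "green", "blue", "red"]

for generation in range(10):
    challenger = random.choice(palette)            # AI iterates an option
    if measure_engagement(challenger) > measure_engagement(current):
        current = challenger                       # release the winner
    print(f"Generation {generation}: serving {current}")
```

In production, of course, each comparison would run on live traffic for long enough to be statistically meaningful before the winner was released.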

The example above uses a simple button colour as an illustration, but the same principle could apply to any aspect of the design process: the size, shape, colour and placement of elements, the sequence of steps in a process, the amount of content, the type and format of images.

And this doesn’t have to be a one-off exercise.

You can keep doing this, creating new variations that build on previous improvements, evolving the experience over time as the machine keeps making incremental changes.

Here the AI takes the winning design from the previous test, and iterates on it again, tests it, selects the winner, and keeps doing it from now until the end of time!

In this way, an AI creating small changes to a design, then A/B testing them at scale against a specific goal, could incrementally evolve a given interface into an entirely new experience better fitted to that goal.

Indeed, some companies are already working on how to translate AI-driven design into practice using “evolutionary algorithms”.

Machine Design Evolution + Personalisation = Divergent Design Evolution

Which is all pretty cool, interesting and a little bit weird.

But things can get cooler, more interesting, and weirder, still.

We’ve assumed that the machine selects a single design solution for a single process for a single audience.

But we know that personalisation can make a huge difference to response rates in content; people like different things and respond differently to different stimuli.

There’s no reason to suppose interface design is any different.

Take the very first example above: there, the orange button was more effective overall, because a larger segment of the audience responded to it, so on aggregate it was the most effective choice.

But a large sub-set of the audience responded better to the purple button. So, if you can segment your audience and serve different designs to different audiences, you don’t need to choose just one option; you can serve the option that’s most effective for each audience.

The A/B test doesn’t rule one design out, it identifies which audiences like which design.

Here the orange audience responds better to the orange button, and the purple audience to the purple button — so we can serve two different versions of the site design to these two different segments — creating a much more effective experience overall.
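In code, the change from the earlier sketch is small: keep a winner per segment instead of a single global winner. A minimal sketch, with invented segments and rates:

```python
# Segment-aware serving: instead of one global winner, keep the winning
# variant per audience segment. Segments and rates are invented examples.

# Conversion rates observed per segment in the same A/B test.
segment_results = {
    "segment_orange": {"orange_button": 0.051, "purple_button": 0.032},
    "segment_purple": {"orange_button": 0.030, "purple_button": 0.048},
}

# Pick the best variant for each segment rather than one for everyone.
winners = {
    segment: max(rates, key=rates.get)
    for segment, rates in segment_results.items()
}

def variant_for(user_segment: str) -> str:
    """Serve each user the design their segment responds to best."""
    return winners.get(user_segment, "orange_button")  # global default

print(variant_for("segment_purple"))  # -> purple_button
```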

But, as above, you don’t have to do this just once.

The AI can keep on trying out new variations — and as it iterates over time, you could end up with entirely different site designs serving entirely different audiences, each evolving and diverging over time to create an experience best suited to its users.

The AI keeps iterating the designs for each audience, and their different preferences drive divergent design outcomes.

One version of an ecommerce website might be optimised for repeat buyers, another for highly visual individuals, for older users, for searchers, or for those who sort by size.

A site might have one version for men and another for women (if they engage with content in different ways).

But more likely, you will create experiences that are different in ways and for reasons that are entirely unexpected: the machine simply does what works, tapping into ways of thinking, consuming, interacting and buying that might be wholly unconscious, unspoken and previously unknown.

So, as well as personalising the content to the person, the digital experience itself will be tailored to the user’s preferences.

That might sound a little odd: the same basic experience will look, feel and behave differently for different people.

A Winning Personality

Which sounds complex and challenging for visual interfaces, but absolutely perfect for voice.

Right now voice experiences are in their infancy—as with most other digital channels, the experience architecture for voice is “one-size-fits-all” — every user gets the same basic experience.

But voice is a conversational medium, and conversation is a medium with some unique features:

  • It requires a mutual exchange of information
  • This mutual exchange requires mutual adaptation
  • This adaptation necessitates a tone, manner & cadence suited to the individuals involved

Some consumers will want to give and take short sharp instruction and action. Some will want careful explanation in a reassuring tone. Some will want to tackle a problem by digressing into related subjects for a holistic view. Some will just want the highlights.

In voice, you want to create experiences that are adapted in tone, manner and cadence to the individual being addressed — you want the experience architecture to be personal to the participants.

Which is exactly the kind of thing created by the “evolutionary design” outlined above — the machine adapts itself to different conversational styles and reflects those styles in its own mode of engagement.

A conversational architecture is machine evolved over time for different audiences — one simpler and more direct, the other more involved and exploratory.

One simple way to express these different voice experiences is to draw an analogy from people: they will engage with people in different ways; simply put, they will have different “personalities”.

One might be direct and business-like, one more chatty and conversational, one much slower and explanatory — all depending on the application, the need, the audience and the evolutionary AI-driven design.
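One illustrative way to represent such a “personality” in code is as a small bundle of conversational parameters selected per user; everything below is invented for illustration:

```python
# A voice "personality" as a bundle of conversational parameters,
# selected per user. Entirely illustrative.
from dataclasses import dataclass

@dataclass
class Personality:
    tone: str        # e.g. "businesslike", "chatty", "reassuring"
    verbosity: str   # "highlights" vs "full explanation"
    pace: str        # "brisk" vs "slow"

# Hypothetical personalities evolved for different audiences.
personalities = {
    "direct": Personality("businesslike", "highlights", "brisk"),
    "explainer": Personality("reassuring", "full explanation", "slow"),
}

def respond(user_profile: str, answer: str) -> str:
    """Deliver the same answer in the style suited to this user."""
    p = personalities[user_profile]
    if p.verbosity == "highlights":
        return answer  # short, sharp instruction and action
    return f"Let me walk you through it. {answer} Shall I go over any part again?"

print(respond("direct", "Your order ships tomorrow."))
print(respond("explainer", "Your order ships tomorrow."))
```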

Evolutionary Design

We’ve long been used to the idea that different platforms and devices require different interfaces and experiences.

The same service is available through desktop browser, mobile browser, mobile app, TV app, and now voice — each time re-worked to deliver the optimal experience for the medium.

But we’re soon going to have to get used to the idea that digital experiences on every platform are designed by machines, that these experiences will evolve over time, and that there won’t be one “one-size-fits-all” experience design; there will be as many versions of an experience as there are customers.

In voice, we’ll expect that services adapt themselves to the person they’re speaking to—and we’ll call that the “personality” of the experience — when you talk to your computer, your computer will present the version of itself that’s best suited to you.

Which might not make your computer fall in love with you, but will definitely make the computer more responsive and adaptable to the way you like to do things.

And that is pretty loveable.

It also opens up a huge range of questions about how you manage a brand and an experience when that experience is as diverse as each of its users — but that is the subject of an entirely different article.

Watch the Webcast:

TL;DR?

I know the feeling!

So you can watch some slides and hear me talk about all of this here, if that’s your kind of thing:

A Webcast for the Design Management Institute, 13 March 2019.

With thanks to Adam Hardy, Blake Di Cosola III, and Hannah Jones for their help in the development, writing and presentation of this article.
