How can we manage the impact of emerging technologies? How can businesses foresee the impact these technologies bring to people’s lives? How can they stay afloat in this rapidly changing digital landscape? And where is the user in all of this?
These were the questions tackled by the panelists at the first instalment of Futureality, the 2018 MA Innovation Management (MAIM) end-of-year conference, held at the Ustwo studio in Shoreditch.
The panel consisted of Akil Benjamin, Co-Founder & Head of Research at Comuzi, Francesca Cuda, Head of Engineering at Ustwo, Shane Duffy, Head of Digital Experience at Mimecast (and ex-MAIM), and Michael Karliner, Technologist and Co-founder of ThingStudio. The event was opened by MAIM’s Alexander Fefegha and chaired by Megumi Koyama.
Shaping digital futures: advice from the panel
To build for diversity, throw out your assumptions and find ways to identify your blind spots.
“People are very complex, we’re all different and from different backgrounds. It’s not just about gender. We need to consider the user in all their complexity.” — Francesca Cuda
Akil: Firstly, don’t make assumptions about the people you’re looking to serve. Whatever you think you know, throw that away. Take time to build personas of who you’re trying to serve — and not just the mainstream but the fringe groups too.
Second, bake that context into the build. Get more people involved, share your ideas, get others to question you. Then you can start asking — who am I missing?
It’s really hard, and it can be slow, but it’s part of the process, so make sure you map it out before you start. Look at where you might have blind spots. Be more considerate of those who don’t look like you, who you don’t meet every day — the people who could be using your product but aren’t. Even though they’re not there, they have value, and they can bring that value to the space you’re in.
Be vigilant in this shifting digital landscape. Build with change in mind.
Shane: At Mimecast, because we’re a cyber resilience company, we have no choice but to stay on top of the hackers. We need to constantly innovate, work out how to protect against user impersonation, CEO fraud, all that stuff. And at the same time stay in control of the process around our data. Right now we have to provide API integrations to our partners, and we have to look at current trends and security issues — how hackers are getting into companies like Sony, how they’re getting their data and using it against them.
Luckily enough, our platform is built for change.
Be socially responsible: establish a code of conduct and sew it into the DNA of your organisation.
Michael: Legislators are still digesting Google; meanwhile, start-ups are already doing things that no regulator has any idea about. Which is why they have to self-regulate. Not just to be ethical, but because if you don’t, reality will catch up with you. Survival for start-ups means ethics has to be built in.
“Legislators have no idea how to deal with you. They’re still digesting Google.” — Michael Karliner
And that means from the start. One piece of advice I give to all the start-ups I work with is to, on day one, establish a code of conduct. Make sure that’s sewn into the DNA of the company. There’s no one template to that, you need to learn and discover that for yourself.
Users, subjects and citizens also need to take responsibility for the future of digital.
Francesca: A key thing about GDPR is that it gives the user the right to an explanation. It gives users leverage, and empowers them to ask questions of the services they use. I would love to see more users actually stop and try to find out what’s behind a pretty page or an app.
With developments in machine learning (and in reference to potential future uses of AI and other digital technologies by governments), responsibility lies on both sides of the government/citizen divide. We need to remember that all these fancy things we talk about when we talk about AI and machine learning are ultimately built by humans. And humans make mistakes. Take AlphaGo as a reference for AI. It’s based on reinforcement learning, which is trained on datasets. I believe that datasets are a mirror of society, and a ‘bad’ society means ‘bad’ data.
So I think there needs to be more emphasis on creators to do the ‘right’ thing, whatever that may mean. But, again, responsibility also needs to be on subjects/citizens, to question the use of tech.
Throughout the discussion the emphasis was on the need for a shift in the relationship between digital technologies and their users. The panel felt that transactions need to become more honest and transparent — particularly in regard to usage of personal data — and called for more user education from businesses on the implications of ‘liking’ or ‘sharing’ content.
This shift to transparency will be helped along by the fast-approaching GDPR. However, formal policy alone is not enough. Social responsibility needs to be written into businesses right from their conception, and users in turn need to cease being passive consumers of products and instead question and actively engage with the digital landscape they find themselves a part of.
This sentiment was extended in the panel’s answer to an audience question concerning the relationship between citizens and their governments. Again, the consensus, in light of emerging tech and developments in machine learning in particular, was that the relationship needs to shift, and perhaps already is shifting, into an active and transparent one. How exactly this will or could happen remains to be seen.
Laurie Atkins is an MA Innovation Management student at Central Saint Martins.