The future of tech for good

Jessica Stacey
Bethnal Green Ventures
Jun 22, 2017


Would you hack your body so you could live forever? Some suggest this could become possible in the future, but would living forever actually be worthwhile, and would it make us happy? These were some of the issues we delved into at the latest Tech for Good Meetup I helped to run last week, which also saw the launch of the new Tech London Advocates Tech for Good Working Group.

The event aimed to transport the audience 33 years into the future, to explore the impact of technology on our lives in 2050, and dig into issues of ethics in technology and responsible design.

Technology is advancing at such a rate that it’s hard to predict what it will look like in 2050. On the one hand, there is huge potential for technology to drive social change: to empower, to educate, to tackle climate change, to revolutionise health and mobility, and to reduce inequality by enhancing open and democratic processes.

On the flip side, we could be facing a future where technology is developed to benefit the few: where its power lies in the hands of a few big corporates, and where marketing imperatives control access to technology and what happens to our data and rights. This is a future where, as Georgie Frost put it, “Our use of technology could actually be eroding civil and human rights.”

In 2050, will tech for good win out over tech for bad? Here are a few thoughts on the future of technology, which I’ve taken from the speakers on the night.

Better data will drive developments in the future

Better data will lead to better understanding, better predictions, better decision making and better services. Take cities as an example: Steve Lorimer from the GLA pointed to current initiatives like the London Data Store, an open data-sharing portal used by citizens and decision makers to better understand the city and develop solutions to problems in London.

But at what cost?

While there is huge potential in using data for social benefit, at the moment the digital world is controlled by a few big tech companies, and we’ve become accustomed to handing over our information to them without paying attention to T&Cs. We’re not in control of the way our data is being used and commoditised for their benefit.

Can the government help protect our data?

Not really, according to Aurelien Simon from Digital Catapult. The panel discussed the development of a universal charter to protect our privacy and data, but Aurelien pointed out that the digital world isn’t powered by governments, so unless you can get the likes of Google to buy into the charter, it just won’t work.

So what’s the alternative? Rob McCargow from PwC said that we need to stop willingly giving our personal data away, and to do this, “We need to start a public conversation about handing over data and what it means.”

Enter AI for good

Elliott Engers from Infinity Health said that having better data is great, but unless you can store it, interrogate it and mine it for insights, then it’s useless. This is where advancements in artificial intelligence are making a huge difference. For example, Elliott talked about improvements in the frequency and accuracy of data in healthcare, which, when combined with advancements in machine learning, have made it possible for astonishingly precise diagnoses to be made.

But what about racist robots?

With AlphaGo we’ve seen that machines can rapidly learn how to do some things better than any human can. Machine learning is a means to derive artificial intelligence by discovering patterns in existing data. But what happens when these patterns mirror our existing human biases? Recent research shows that robots can be taught to be racist, sexist and prejudiced by learning from humans.

All you need to do is search Google Images for ‘doctor’ or ‘secretary’ to understand how our human biases are transferred to technology. Ingrid Marsh from BustingBiases talked about the US rollout of ‘predictive policing’ software, which led to over-policing in predominantly black areas because of biased data.
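To make that mechanism concrete, here is a minimal toy sketch (my own illustration, not something shown at the event) of how a model trained on biased historical decisions simply plays that bias back. The groups, rates and the scikit-learn model are all illustrative assumptions, not real policing or hiring data.

```python
# Toy illustration: a model trained on biased historical decisions
# learns to reproduce that bias. All numbers here are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with an identical 10% underlying "risk".
group = rng.integers(0, 2, size=n)          # 0 = group A, 1 = group B
true_risk = rng.random(n) < 0.10

# The training label records past human decisions, which flagged
# group B far more often than group A for the same behaviour.
flag_rate = np.where(group == 1, 0.30, 0.05)
biased_label = true_risk | (rng.random(n) < flag_rate)

# Train on features that include (or proxy for) group membership.
X = np.column_stack([group, rng.normal(size=n)])
model = LogisticRegression().fit(X, biased_label)

# The learned "pattern" is just the historical bias played back.
risk = model.predict_proba(X)[:, 1]
for g, name in [(0, "group A"), (1, "group B")]:
    print(f"{name}: mean predicted risk = {risk[group == g].mean():.2f}")
# Prints roughly 0.15 for group A and 0.37 for group B, despite
# identical true risk, because the labels themselves were biased.
```

The particular model isn’t the point: any pattern-finder given labels produced by biased human decisions will faithfully learn that skew and apply it at scale.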

Which brings us to the people behind the technology

Technology doesn’t help or hinder on its own; people do. Catherine Miller from Doteveryone said, “When we’re talking about technology we’re talking about the people who make the technology and the people who use the technology — so we’re talking about people.”

So what can we, the people, do to ensure a fairer future through the use of technology? These are a few of my takeaways from the event:

Tech for Good is as much about the way tech is made as it is about what it does: These principles over on techforgood.global are a great starting point for thinking about good development of technology. Building on this, I’m really looking forward to seeing the results of the work Doteveryone is doing around an ethical framework for technology.

Stop disrupting and start collaborating: Steve Lorimer suggests that we need to move away from the mentality of disruption and destruction towards more collaborative models, where solutions can be designed in collaboration with users and public bodies.

Diversify the workforce making the technology: To tackle those racist robots we need to find ways to diversify the workforce that will be curating the data sets and developing the technology that will determine the future.

Equip society for the impact of technology: We don’t know what technologies today’s children will be using in the future, so their education now should be less about specific technologies and more about switching their brains on to cope with whatever is coming down the line.

Shift this conversation out into the wider public: Particularly around data. We need to find ways to translate these ideas in a meaningful way and get people to care about their rights online.

With a fairer framework for the way people develop technology I think the future could look bright. So would I get that chip implant that would help me live forever? Nah, I still don’t think so.

Jessica Stacey, Bethnal Green Ventures
Product manager interested in startups, tech ethics and responsible approaches to product development