This week I attended the 2019 Open Data Institute (ODI) Summit. The theme of the day was: the impact of data. The speakers and panelists used the Summit to explore positive and negative consequences of data in our physical, social, and financial systems. They did this through themes of data trust, fairness, safety, bias, and transparency.
The opening keynote address was delivered by the ODI Chairmen, Sir Nigel Shadbolt and Sir Tim Berners-Lee. They led with an impassioned plea for the CEOs of Google and Facebook to issue a moratorium on political ads on their platforms ahead of the upcoming UK elections. Coming from two advocates for open data and transparency, a call for any restriction on the sharing of information is a serious departure from their typical mantra that more information is better for everyone. In effect, the leaders of the ODI are criticising Facebook and Google as unfit for sharing information: the provenance of information on these platforms is difficult for us to obtain, and the truth of what is shared is not verified. The UK’s Communications Act 2003 prohibits political advertisements on TV and radio, but there is no equivalent regulation for political ads online.
Sir Nigel and Sir Tim’s alarm on this issue highlighted a gap in top-down governance, and other speakers throughout the day echoed this message. However, speakers frequently suggested that data transparency and fairness were, at least in part, reliant on more robust regulation. This cyclical conversation recognised the current reality (gaps in elected representatives’ understanding of data, data systems, and the consequences of making decisions based on today’s messy data) alongside a call for better regulation of the use of data (addressing biases, misuse, and malicious use). Meeting this challenge requires education and knowledge building across all levels of the public sector, as well as learning from past examples of how regulation was applied to new sectors.
In addition to the immature regulation of our digital systems, Sir Tim offered that we may not be in full technical control of the information systems we have built. The decisions that private companies, individuals, academics, or public agencies make about how information is, or is not, shared should be the result of considered choices, not a consequence of the technical architecture in which our data happens to be located. However, many of the systems we currently use do not readily support interoperability or interrogatable code. We may have designed systems with unintended, unplanned-for consequences, and the technical design of those systems makes it challenging to course-correct.
The afternoon keynote speaker, Caroline Criado Perez, picked up this theme of unintended consequences through the bias we have embedded into almost everything we make. Indeed, we are living in spaces that were probably not designed for us. When people design spaces, services, and systems, their personal experiences influence the ultimate form they create. We have a history of men dominating academic and professional spheres. As a result, we are living with the legacy of the systemic exclusion of women, both as designers and as intended, primary users of the tools, products, and services that exist today. The remedy for centuries of homogeneous design? Criado Perez highlights the need to collect disaggregated data, talk about data bias, and expand the diversity of the teams you work with in tech.
Gender bias in data poses an interesting tension for sharing personal data: while we wait for top-down regulation to catch up with the wild west of data, we cannot wait any longer to address the biases we are currently embedding in our products (like the recent Apple Card example) and in the machine learning systems we train on that data.
The speakers and participants of the ODI Summit had the opportunity to share their hopes and fears for the future of data and how it is shared and managed. For me, the ODI Summit was a reminder that we are not on a pre-defined, inevitable path into the future. It takes work, but we can decide how to share and regulate information. It is within our power to design a future that we want.
Other references from the day:
- A pick from Sir Tim Berners-Lee: https://whotargets.me/en/
- Kriti Sharma’s organisation, AI for Good: https://www.aiforgood.co.uk/
- A study from the Ada Lovelace Institute examined public opinion on the use of facial recognition technology. Carly Kind also recommends a moratorium on companies selling or using facial recognition software.
- The Energy Systems Catapult is looking for input to define best practice for energy data.
- Caroline Criado Perez’s book, Invisible Women: Exposing Data Bias in a World Designed for Men, is out now.
- The UK government is developing a national Data Strategy, due in 2020.
- ODI is supporting fantastic data artists, like Alistair Gentry and Mr Gee.