Breaking the chain with contact tracing

Stephanie Organ
Nightingale HQ
3 min read · Nov 4, 2020


Years of mistrust

Mistrust in technology has steadily grown over the past decade, but Edelman research shows that trust can be restored through transparency. Fear currently reigns that the very technology proposed to control the spread of a pandemic and free us from lockdowns could also be used for surveillance, which is anything but liberating.

Trust in technology has taken a nosedive over the last 10 years, particularly around the processing of data. The 2013 Snowden revelations showed how governments were violating the privacy of their citizens, information which went on to shape the EU General Data Protection Regulation and which triggered a new era of data privacy and transparency.

However, 2016 brought to light the Cambridge Analytica scandal, first in the US presidential campaign and later in the UK referendum, showing how data can be used not just to target individuals but for global manipulation, targeting society as a whole.

The privacy paradox

There has been progress, with the EU just announcing that the Commission is setting up an interoperability gateway service linking national apps across the EU. It will see test runs between the backend servers of the official apps from Ireland, Italy, the Czech Republic, Denmark, Germany and Latvia. The gateway server, developed and set up by T-Systems and SAP, will be operated from the Commission's data centre in Luxembourg and is expected to launch in October.

The service follows an agreement by Member States on technical specifications for a European solution: a safe exchange of information between the backends of national contact tracing and warning apps, built on a decentralised architecture. It is this decentralised approach that I will now explore in greater detail.

The Solution: Decentralised AI

It becomes clear how big an issue this is when two of the tech world's biggest competitors decide to come together to solve the problem. Google and Apple began collaborating to produce an API based on some key principles:

  • Decentralised data storage
  • No mass data collection
  • No location tracking

These were the same factors found in a German study to boost the likelihood of adoption, alongside voluntary use of an app. Transparency is key to giving people confidence. This decentralised approach works by storing randomised key codes from users of the app who come within proximity of each other. If a user reports COVID-19 symptoms, their key codes are flagged on the server and any matches are alerted.

This system works well because no other information about the users is stored, and codes are regenerated every 10–15 minutes, so you can't easily get an overview of someone's complete interactions or see exactly where they've been. This future-proofs it against misuse.
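The matching logic described above can be sketched in a few lines of Python. This is a toy illustration of the decentralised idea only, not the real Google/Apple exposure notification protocol: the class and function names are hypothetical, and real apps rotate keys cryptographically and exchange them over Bluetooth rather than comparing strings on a server.

```python
import secrets

class Device:
    """Toy model of a phone in a decentralised tracing scheme."""
    def __init__(self, n_periods=4):
        # One random key per 10-15 minute window; nothing links them to a person.
        self.own_keys = [secrets.token_hex(16) for _ in range(n_periods)]
        self.observed = set()  # keys heard from nearby devices

    def encounter(self, other):
        # Proximity contact: each phone records the other's keys.
        self.observed.update(other.own_keys)
        other.observed.update(self.own_keys)

def report_positive(device, flagged_keys):
    # A user who reports symptoms uploads only their own random keys.
    flagged_keys.update(device.own_keys)

def is_exposed(device, flagged_keys):
    # Each phone checks locally whether any key it observed was flagged.
    return bool(device.observed & flagged_keys)

alice, bob, carol = Device(), Device(), Device()
alice.encounter(bob)             # Alice and Bob were in proximity
flagged = set()
report_positive(alice, flagged)  # Alice reports COVID-19 symptoms
print(is_exposed(bob, flagged))    # True: Bob's phone finds a match
print(is_exposed(carol, flagged))  # False: Carol never met Alice
```

Note that the server only ever sees random keys from users who chose to report; who met whom, and where, stays on the phones.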

Collaborative machine learning

Collaborative machine learning works by downloading the current model, learning from local data (i.e. your phone inputs) and sending an encrypted summary to the cloud, where it is averaged with other users' summaries and used to update the shared model. This means personal data never leaves your device, but the whole system still learns from your inputs.
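The download-train-average loop can be sketched as follows. This is a minimal illustration under stated assumptions: a one-parameter linear model y = w·x trained by gradient descent, plain floats standing in for the encrypted summaries, and hypothetical function names throughout.

```python
def local_update(w, local_data, lr=0.1):
    """On-device step: refine the downloaded weight on private data.
    Only the updated weight (a summary), never the raw data, is returned."""
    for x, y in local_data:
        grad = 2 * (w * x - y) * x  # gradient of squared error for y = w*x
        w -= lr * grad
    return w

def federated_average(updates):
    """Server step: average the summaries into a new shared model."""
    return sum(updates) / len(updates)

# Three phones, each holding private samples drawn from the rule y = 2x.
devices = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(0.5, 1.0), (1.5, 3.0)],
]

w_global = 0.0
for _ in range(50):  # repeated communication rounds
    updates = [local_update(w_global, data) for data in devices]
    w_global = federated_average(updates)

print(round(w_global, 2))  # converges to the true slope, 2.0
```

The shared model learns the underlying pattern even though no device ever reveals its data points, which is exactly the property that makes this approach attractive for privacy-sensitive applications.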

Decentralised AI is basically data science without the collection of data. It’s a key step in creating trustworthy AI since it emphasises data security and privacy, and can be applied in a variety of ways, from contact tracing apps to medical diagnosis models and much more.

Earlier this year, as part of our #AIFightsBack webinar series, Dr Iain Keaney, Data Scientist and Founder of Skellig.ai, delivered a guest talk on contact tracing apps.

Originally published at https://blog.nightingalehq.ai by Dr Iain Keaney
