Genomics data at risk, again

Sarah Gold
Published in Writing by IF · 3 min read · Oct 21, 2024

The latest report of another alleged break in trust relating to UK Biobank is deeply worrying.

If leadership and government don’t change their approach, this will happen again. This latest report should make organisations rethink how they allow researchers to access datasets, especially given the UK government’s likely ambition to give AI companies researcher access.

Whilst we await the technical details of the trust break from UK Biobank, other stories across the private and public sectors offer instructive examples.

23andMe, a private ancestry company, is in trouble

Last year 23andMe reported a second breach. Hackers stole ancestry data, enabled by poor data practices that failed to recognise or act on the relationships between individuals and their relatives. 6.9 million accounts were breached, roughly half of all users, and countless relatives of those users were also affected. The harms included equity issues, increased risk of discrimination and harassment, and financial loss. The fallout: $30 million in settlements, a 99% decline in stock value, a full board resignation and mass opt-outs.

The stakes are high

Organisations holding big datasets about people need a new playbook. Most still rely on the old one: consent, de-identified data and engagement practices. Necessary, but woefully inadequate today.

There is a big gap in expertise and curiosity about technology, and without significant change it is only a matter of time before the next crisis. Those who will suffer most are participants: the people who generously consent to information about them being used for research.

This status quo poses a significant risk to the wider sector. How many more trust breaches will the UK public accept before mass opt-out becomes the norm?

Our ability to trust organisations is critical

Finding new ways to prevent, detect and treat disease through genomics data is only possible if people trust that data about them is being stored and used in ways they expect and agree to. Participants need to trust the genomics organisation in order to join and remain with the programme. Researchers need to trust the organisation in order to use its capabilities in their research on diseases.

The tools and practices for building trustworthy data products are lacking

Today, collecting, combining and reusing data is getting easier, as is sharing data between organisations. That’s before we start applying AI.

Whilst Trusted Research Environments are emerging, there is still a lot of work to do to make them worth trusting: to help people feel safe, respected and empowered.

Illustration of fragmented health data being organised into verifiable data logs, from IF's work with DeepMind Health to help solve their trust challenges.

Demonstrating trustworthiness is not only about good governance or better audits. It is about the legibility of the underlying systems, and about that legibility being experienced, both by people using the service and by people coming into contact with the system more broadly. It is an urgent infrastructure and human problem to be solved. With a new playbook.

UK Biobank now has a choice

Risk another catastrophe? Or be the exemplar of change?

Projects by IF help organisations take the leap from experience to trust, meet the challenges already here, and build better futures.

  • For transparency — this blog post was edited on Monday 21st October at the request of UK Biobank, to clarify that the trust break is alleged.
