A Better Work Dispatch from Stoke-on-Trent

We are currently undertaking in-depth, qualitative research with gig workers to better understand their needs and help develop ways for them to manage their money and progress in their careers.

Here are some brief insights from a recent research trip to talk to Uber drivers in Stoke-on-Trent, Staffordshire, about working via a digital platform. We also share a few things we learnt about doing contextual research of this kind.

Gladstone Pottery Museum, Stoke-on-Trent. Credit: creativetourist.com

Stoke-on-Trent is perhaps best known for its rich heritage of ceramics (we’d highly recommend the Gladstone Pottery Museum), and its cult status amongst sports fans as the spiritual home of “route-one” football. …


Illustration by James Barclay

Facial recognition technologies promise a wide range of applications, from identity verification to the detection of criminals. But a growing track record of worrying misuse is leading many to ask whether the technology should be banned outright.

In the UK, police use of the technology is being challenged in the courts, with calls for an outright ban, while cases of inaccurate, biased and dubious applications mount.

This post outlines the considerable challenges caused by facial recognition, and gives Doteveryone’s perspective on how to make the technology accountable and responsible.

Read the full briefing paper on facial recognition.

What is facial recognition?

Facial recognition technologies (FRT) identify and categorise people by analysing digital images of their faces. The most advanced forms typically use artificial intelligence to map an individual’s facial features — then compare this map to a database to look for a match, or to categorise the individual on the basis of inferred characteristics such as age, gender, ethnicity or emotion. …
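
To make that "map the features, then look for a match" step concrete, here is a minimal illustrative sketch. It is not taken from the briefing paper and does not describe any particular deployed system: it assumes a hypothetical model has already converted each face image into a numerical feature vector, and simply compares a probe vector against a small database of stored vectors using a distance threshold. The embeddings, names and threshold below are all made-up placeholders.

```python
import numpy as np

# Illustrative only: in a real system these 128-dimensional "embeddings"
# would be produced by a trained neural network from face images.
# Here they are random placeholders.
rng = np.random.default_rng(0)
database = {
    "person_a": rng.normal(size=128),
    "person_b": rng.normal(size=128),
}
# Pretend this is a new photo of person_b, with a little noise added.
probe = database["person_b"] + rng.normal(scale=0.02, size=128)

def best_match(probe, database, threshold=0.6):
    """One-to-many matching: return the closest stored identity,
    or None if nothing falls within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, embedding in database.items():
        dist = np.linalg.norm(probe - embedding)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return (best_name, best_dist) if best_dist <= threshold else (None, best_dist)

print(best_match(probe, database))  # -> ('person_b', ~0.2)
```

The threshold is where much of the policy debate bites: set it loosely and the system produces false matches; set it tightly and it fails to recognise people; and error rates can differ across demographic groups depending on how the underlying model was trained.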


Why we need to take care with making the public aware

Image credit: Chuttsnap. Amended by James Barclay.

Everyone’s full of advice at the moment about how to navigate the perils of the digital age.

Are you a “youth” looking to navigate the complex world of social media? Facebook have got you covered. Tired of getting scammed online? You better listen to the Home Office’s friendly Guvnor’s advice. Maybe you’re having a hard time spotting “fake news” — here’s the government’s furry green cyborg to illustrate the dangers.

The proliferation of new public awareness campaigns around tech issues is symptomatic of the belief that media literacy will empower the public to control their digital lives and stem the harms which some technological innovations are creating. …


From the culture of basic scientific research through to wellbeing apps — how digital technology is affecting care and connection in universities.

Photo by Dose Media. Adapted by James Barclay.

A care crisis on campus

The UK’s universities are in the midst of a mental health crisis.

The number of students seeking mental health support rose by 50% between 2012 and 2017, whilst the University of Bristol made headlines for the wrong reasons following the loss of 12 students to suspected suicide in the past three years. Only 17% of students surveyed in 2018 reported being very happy with their lives, down from 21% in 2016.

The magnitude of the problem has led the university regulator, the Office for Students, to launch a £6 million challenge to generate new approaches to mental health in higher education. …


Photo by rawpixel. Adapted by James Barclay.

Doteveryone is currently running a project to explore how we can build better and fairer care systems in a future of robots, exoskeletons and smart homes.

Throughout the Better Care Systems project, we’re talking to carers, care professionals, technologists, activists and people who use care services (and many who fall under more than one of these too-limiting labels) about the current challenges, and asking them to imagine possible alternative futures of care in which the system is fairer, more sustainable and more effective.

This post explores how we commonly talk about AI, automation and care in the UK, why that framing is flawed, and how society and the care community can shape it for the better.


How can we make it easier for people to seek redress in the digital world?

Earlier this month, together with Resolver, the free online tool designed to make complaining quick and easy, we hosted the first in a series of three Yes to Redress! meet-ups.

Here’s a summary of the conversation at the first meet-up, held on 5 December 2018, and an introduction to some of our plans for championing and improving redress in the digital world in 2019 and beyond.

Why say Yes to Redress!?

Doteveryone cares about redress for a number of reasons.

For individuals, clear and easy access to redress means they are better able to hold tech companies to account when their digital rights have been breached. …


Illustration: Elin Matilda Andersson

Earlier in October we published Regulating for Responsible Technology: Capacity, evidence and redress. In the first in a series of four posts, I sketched out the report’s proposal for a new independent regulatory body — the Office for Responsible Technology — to empower regulators, inform the public and policymakers about digital technologies, and support people to seek redress for technology-driven harms.

This post describes the last of these three functions in greater depth and explores how the Office can raise standards for complaints handling across the tech sector and enable people to hold the public and private sectors to account for the negative impacts of their digital services.


Illustration: Elin Matilda Andersson

Earlier in October we published Regulating for Responsible Technology: Capacity, evidence and redress. In the first in a series of four posts, I sketched out our proposal for a new independent regulatory body — the Office for Responsible Technology — to empower regulators, inform the public and policymakers about digital technologies, and support people to seek redress for technology-driven harms.

This post explores the second of these functions — how the Office can provide clarity and guidance to policymakers and the public on digital harms and opportunities.

Doteveryone’s vision for an Office for Responsible Technology is founded on a systems approach to regulation. This acknowledges that complex and dynamic digital technologies are beyond the control of one institution alone. …


Illustration: Elin Matilda Andersson

Last week we published Regulating for Responsible Technology: Capacity, evidence and redress. In the first in a series of four posts, I outlined Doteveryone’s proposals for a new independent regulatory body — the Office for Responsible Technology — to empower regulators, inform the public and policymakers about digital technologies, and support people to seek redress for technology-driven harms.

This post describes in greater depth the first of these three functions and explores how the Office can make the UK’s existing regulatory system more resilient, responsive and intelligent.

We see the paper as an important step in an ongoing and lively conversation around digital regulation, and throughout the post I’ve included a few tweets to give you a flavour of the many interesting discussions that have taken place since we published our proposals. To share your thoughts, get in touch at @doteveryoneuk or hello@doteveryone.org.uk.


Today Doteveryone publishes Regulating for Responsible Technology: Capacity, Evidence and Redress. The paper outlines our vision for a new independent regulatory body that will direct digital technologies for the public good.

We recommend establishing an Office for Responsible Technology that will:

  1. Empower regulators with the capacity to hold technology to account. The Office identifies what powers regulators need, supports them to build specialist skills and looks ahead to help them anticipate emerging issues for their sectors.
  2. Inform the public and policymakers so that regulation is founded on an authoritative body of evidence about the benefits and harms of technologies and the public has a source of independent and understandable information. …

About

Jacob Ohrvik-Stott

Research, policy and all-things digital at @doteveryoneuk
