Superbad? Facial Recognition in High Schools and Policing

Published in CognitionX AI Ethics · 2 min read · Dec 13, 2018

This issue is about surveillance and the importance of responsible implementation. As surveillance systems proliferate in our schools, governments, cities and homes, we must interrogate why and how they are being rolled out. Under what rules do we manage new AI systems? What safeguards and controls should organisations use when first implementing a new AI tool? How can the system be audited? This week’s articles highlight the potential for unintended bias, security risks, and other negative outcomes from irresponsible roll-outs of AI.

My colleague James Kingston has been hard at work on our upcoming AI Ethics primer, which gives you your ‘need to know’ introduction to the world of ethics in AI. To pre-register your interest in the primer, email him at james.kingston@cognitionx.io

Read on to hear more about health tech surveillance, the uses and risks of computer vision in policing and high school security, and an essay on the importance of empathy.

Corporate Surveillance

You Snooze, You Lose: Insurers Make the Old Adage Literally True

Interesting article on sleep apnea, the US health insurance system, and corporate surveillance.


Freedom, Privacy

The Trouble With Trusting AI To Interpret Police Body-Cam Video

This article highlights the importance of transparency and accountability in the use of AI, and the social risks of placing undue reliance on as-yet-unproven systems.

Surveillance, Privacy, Bias

Facing Tomorrow’s High-Tech School Surveillance

Where problems with AI arise, it is often not the fault of the ‘AI’ itself. Instead, it is the human processes and systems around it. “As you are describing this to me”, said one researcher regarding new high school surveillance tech, “a whole bunch of red flags are popping up and not one of them is about the machine learning. The machine bias is probably the least fraught question of all these.” Really interesting read.

Surveillance

Facial recognition camera catches top businesswoman “jaywalking” because her face was on a bus

Across China, facial recognition systems are being used to name and shame people who cross the street at the wrong time. One system picked up a face from an advertisement on the side of a passing bus, mistook it for a real pedestrian, and publicly displayed the businesswoman as a rule-breaker, illustrating the worrying imperfections in these systems.

Trust

The Empathy Economy

Interesting article on empathy and trust as the key corporate resources of the future in the age of automation and convenience.

CognitionX AI Ethics: The most trusted source of personalised advice on All Things AI