Fast.ai Practical Data Ethics lesson 4 notes (continued) - Privacy and surveillance: towards solutions

Surveillance can cause a lot of harm to the most vulnerable. But we can improve the situation.

Risto Hinno
Sep 21, 2020

Lesson 4 (second part) materials are here. Lesson 4 first part materials are here. Notes for the first part can be found here. These are my notes, so they are not complete and may contain mistakes.

Surveillance concerns

Surveillance is a tool used to observe the “others”. Those others might be minorities who are seen as “different”. When they organize, there is a fear that they might gain power, and surveillance is used to keep them quiet. Some amount of surveillance is probably needed for security, but many agree that today’s level of surveillance is much higher than what security requires. Even if surveillance worked, it would still create concerns and harms for society.

Bad when (surveillance) system doesn’t work

  • Little evidence that surveillance products (for example facial recognition software) work at stopping crime or shootings.
  • Data is full of errors (remember the gang member database where some of the “gang members” were less than 1 year old (source)).

Bad when (surveillance) system works as intended

  • Surveilling people of marginalized groups.
  • Suppressing protests and dissent.

Surveillance can still be used for bad purposes even if the machine learning models work perfectly. The mere fact that we have data about some group of people raises concerns: how do we know that the collected data will not be used for harm in the future?

Some ideas and concerns about surveillance from The Moral Character of Cryptographic Work:

  • Surveillance is an instrument of power.
  • Mass surveillance tends to produce uniform, compliant and shallow people.
  • Privacy is a social good.
  • Creeping surveillance is hard to stop due to interlocking corporate and government interests (the military-industrial-academic complex).
Source

But there are some things that we can do.

Towards solutions

Table of contents for next sections:

  • Proposals that won’t solve it.
  • What motivates companies to change.
  • Hope from history.
  • Privacy as a public good.
  • Regulating data collection and political ads.

Proposals that won’t solve it

Should we pay people for their data?

The answer is “no”:

  • Fails to treat privacy as a public good.
  • Fails to treat privacy as a human right.
  • Privacy becomes a luxury for the rich: they have more wealth, so they don’t need to sell their data to make a living.
  • Virtually impossible for individuals to calculate the value of their data: it is spread over time and changes with aggregation (Abdulrahim, Famoroti).
  • Puts the burden of time and education on the consumer, not on the firms that have all the power (@Iwillleavenow).
  • Would entrench the asymmetric and exploitative relationships between firms and individuals (@rando_walker).

Privacy should be seen as a public good and a human right, not as a property right.

What about differential privacy?

Note: “Differential privacy is a system for publicly sharing information about a dataset by describing the patterns of groups within the dataset while withholding information about individuals in the dataset. The idea behind differential privacy is that if the effect of making an arbitrary single substitution in the database is small enough, the query result cannot be used to infer much about any single individual, and therefore provides privacy.” (source). More information can be found here.
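
To make the idea more concrete, here is a minimal sketch (not from the lesson) of the Laplace mechanism, one common way to answer a count query with differential privacy. The data, the function name and the epsilon value are made up for illustration.

```python
import numpy as np

def dp_count(values, predicate, epsilon=1.0):
    # Count query: how many rows satisfy the predicate.
    # Adding or removing one person changes a count by at most 1
    # (sensitivity = 1), so Laplace noise with scale 1/epsilon
    # gives epsilon-differential privacy for this query.
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical data: ages of people in a database
ages = [23, 35, 45, 31, 62, 29, 41]
# Noisy answer to "how many people are over 40?"
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy: the noisy answer is still useful for statistics about the group, while revealing little about any single individual.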

Differential privacy is generally not a solution. Critiques (from Rogaway):

  • Implicitly assumes that database owner is the “good guy”.
  • Treats harm as individual, not community-wide.
  • Rarely considers alternative of collecting less data.
  • Gives corporations a means of whitewashing the risks.

Differential privacy doesn’t address most concerns related to surveillance and privacy.

What motivates companies to change

Facebook played a role in the Rohingya genocide (source) by being a platform on which hate speech spread easily in Myanmar (source and source). But that hasn’t really changed Facebook’s behavior much.

Villages destroyed in Rohingya genocide. Source

Another case is Germany, which made its anti-hate-speech laws stricter and raised fines (up to 50 million euros), which motivated Facebook to hire 1,200 people (source). Financial motivation works for companies: fear of losing money creates incentives to make real changes.

Hope from history

Industries can change. Car safety is an example (source).

Early cars:

  • Sharp metal knobs on the dashboard that could lodge in people’s skulls in a crash.
  • Non-collapsible steering columns would frequently impale drivers.
  • Belief that cars were dangerous because of the people driving them.
Who would have survived a crash in this car? Source

The industry didn’t want to talk about safety because it would scare customers away (it would associate cars with deaths and accidents). There was active lobbying against people who talked about car safety.

But if we look at the industry now, we can clearly see that safety is a feature and is used in marketing. Car makers compete over who can build the safest cars.

Privacy as a public good

We should look at privacy as a public/social good, not only as a benefit for the individual.

“The infrastructure of mass surveillance is too complex, and the tech oligopoly too powerful, to make it meaningful to talk about individual consent.”

“To what extent is living in a surveillance-saturated world compatible with pluralism and democracy?” Maciej Ceglowski

As said before, looking at privacy as a public good changes how we should treat it.

Source
Source

According to Tawana Petty, surveillance is not safety. Privacy and safety are public goods. Surveillance is not a precondition for safety, and in some cases it reduces safety.

Regulating data collection and political ads

Why doesn’t Facebook really change its data collection policy?

Reason why Facebook won’t change. Source

But that doesn’t mean that things couldn’t be improved. “There was even bi-partisan agreement on how to regulate digital ads in a basic way. Similar to how tv ads are governed: more transparency, databases of content, need actual gov oversight.” (source) Regulation is not a perfect solution, but it might still help to create some order in the Wild West of digital ads.

What else could be done to improve the situation (source):

  • Data collection should only happen through clear, concise, and transparent opt-in.
  • People should have access to all data collected on them.
  • Data collection should be limited to specifically enumerated purposes.
  • Aggregate use of data should be limited.

If you are more interested in this topic, here is a list of experts to follow:

Source
