From Amazon’s facial recognition moratorium to the rejection of Chinese apps: why are Big Tech and nation states alike withholding services?

Protests at the US Consulate in Hong Kong, 2019. Unsplash

On the 17th of June 2020, aljazeera.com reported that 20 Indian soldiers had been killed in a violent border skirmish with China, part of an ongoing territorial dispute. India chose to retaliate digitally, banning 59 Chinese apps, including TikTok and WeChat, from Indian app stores. The government’s announcement stated that these apps had “engaged in activities … prejudicial to [the] sovereignty and integrity of India.”

That makes sense — apps like TikTok didn’t get where they are today by politely declining to harvest user data for profit. However, I suspect the more likely desired effect of this move was not to protect Indian citizens from underhanded mobile apps, but to hurt China’s economy. India was TikTok’s largest overseas market, after all. Not only was this an effective move, but a relatively easy one: why bother the UN with a cumbersome, old-fashioned economic sanction when you could just ask Apple and Google to exclude a few dozen apps from your country’s stores? Why indeed! …


Privacy is too big to understand. But do all of us really need to understand it?

Image by me

It’s boring, abstract, and honestly ‘privacy’ isn’t even the right word. After being immersed in the data privacy space for a solid year, I’ve noticed some common themes in rhetoric and attitudes, as well as problems that still don’t have solutions, and summarised these into five main points:

1. No one cares about privacy

Here’s something that privacy advocates desperately need to understand: privacy is unimportant and expensive. Sure, apps like TikTok are built with intrusive SDKs and have the most aggressive recommendation algorithms out there, BUT: the people using TikTok don’t care. They just want some fun social media to briefly distract them from the much more pressing issues that underpin their lives, such as earning a living wage during a global pandemic. …


My likeness doesn’t belong to anyone but me — stop turning it into data


Picture this: you’re a delivery person going door to door dropping off packages. You’re overworked and underpaid. You arrive at a house and ring the doorbell. You wait patiently for someone to answer. Everything is normal. Oh, except: you’ve just been profiled on a facial recognition database owned by Amazon. Wow, what a time to be alive.

The scenario above has been made possible by Ring, an Amazon-owned company that makes smart doorbells for Karens. These doorbells are powered by Rekognition, Amazon’s facial recognition technology, which works very well, actually (at automating the systemic racism that is already heavily embedded in our society). Amazon have been working with US police departments to help them “solve crimes and protect people”, by which I mean create a private surveillance network and cultivate a culture of paranoia. They’ve trained police to recommend the doorbells to people who don’t have them already (why isn’t that illegal?).
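For anyone wondering how much engineering effort this kind of profiling actually takes: barely any. Below is a rough sketch using Rekognition’s public search_faces_by_image call against a made-up face collection. To be clear, this is not Ring’s actual pipeline, and the collection name and image file are invented; it’s just what the off-the-shelf service hands to anyone with an AWS account.

```python
# Sketch only: not Ring's pipeline. The collection name and image path are
# made up; search_faces_by_image is the standard public Rekognition call.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# A "collection" is a server-side database of previously indexed faces.
with open("doorbell_frame.jpg", "rb") as f:
    frame = f.read()

response = rekognition.search_faces_by_image(
    CollectionId="hypothetical-neighbourhood-watchlist",
    Image={"Bytes": frame},
    FaceMatchThreshold=90,  # only return matches of at least 90% similarity
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Matched face {face['FaceId']} at {match['Similarity']:.1f}% similarity")
```

That’s the whole job: one API call per doorbell frame, and the hard part (building and running the face-matching models) is handled entirely by Amazon.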


Instead, the UK government are working with Faculty to create a comprehensive social graph


Today NHSX, the digital arm of the NHS, are piloting their contact tracing app on the Isle of Wight. Before we dive into the privacy concerns, I would like to point out that the app itself is shoddy: it will not work properly as a contact tracing app, and therefore it probably won’t work very well as a secret spying app either. Once again, the sheer ineptitude of this government stops them from even being evil effectively. Classic.

Let’s start with the basics: this app is not even up to the standard required to feature in the NHS app store. According to the Health Service Journal, NHS Digital haven’t been able to get their hands on a solid codebase to test because “they keep changing it all over the place”. …


Isn’t it funny that this whole time, we weren’t aware that the UK government could do a better job than Google and Apple combined?

Matt Hancock thinking of his great idea for an app

Okay, just for a second, imagine that this whole Coronavirus thing doesn’t exist and that you get to be an early adopter of a cool new app. As an early adopter of this app, you get to choose:

☝️OPTION ONE: the app will be built by both Google and Apple, working together, in an unprecedented tech giant mega-merge, the likes of which the world has never seen.

OR

✌️OPTION TWO: the app will be built by the UK government, a group of people who get their emails printed out for them by their interns.

Gosh, what a TOUGH choice. Of course, when it comes to the contact tracing apps that both parties are actually building, you don’t get a choice at all. UK residents will be stuck with a clunky piece of rubbish that doesn’t work, and the rest of the world will be at the mercy of Apple and Google, just as they have been for the past decade. …
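For what it’s worth, the two options aren’t just branded differently; they’re architecturally opposite. The Apple/Google approach keeps contact matching on your phone, while the NHSX design sends contact events to a central server. Here is a loose, deliberately simplified sketch of the rolling-identifier idea behind the decentralised approach. It is not the real Exposure Notification spec (the actual key derivation and scheduling are different); it only exists to show why, in that model, no central database of who-met-whom ever needs to exist.

```python
# Simplified sketch of decentralised contact tracing. NOT the real
# Exposure Notification protocol; the key derivation here is illustrative only.
import hmac
import hashlib
import os


def daily_key() -> bytes:
    """A fresh random secret generated on the phone each day; it never leaves
    the device unless the owner tests positive and chooses to upload it."""
    return os.urandom(16)


def rolling_id(key: bytes, interval: int) -> bytes:
    """The pseudonymous ID broadcast over Bluetooth for one ~10-minute slot.
    Without the key, observers can't link one interval's ID to the next."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]


# A contact's phone broadcasts rolling IDs; my phone records the ones it hears
# while we're near each other (say, slots 40-45 of the day's 144 intervals).
contact_key = daily_key()
heard_by_my_phone = {rolling_id(contact_key, i) for i in range(40, 46)}

# The contact later tests positive and uploads only their daily key.
published_keys = [contact_key]

# Matching happens on MY phone: regenerate IDs from each published key and
# compare them against what was actually heard. No server sees who I met.
exposed = any(
    rolling_id(key, i) in heard_by_my_phone
    for key in published_keys
    for i in range(144)
)
print("Possible exposure" if exposed else "No exposure")
```

The NHSX app inverts this: the anonymous IDs your phone has heard get uploaded to a government-run server, which does the matching centrally and, in the process, accumulates exactly the kind of social graph discussed above.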


Here’s how Big Tech’s main players are getting their privacy messaging wrong…


When the GDPR bomb finally dropped in 2018, there was a desperate scramble among Big Tech companies — whose business models revolve around the exploitation of user data — to make some changes. But are these changes only cosmetic?

This falls under what I like to call the privacy promise — the persistent messaging we get from Big Tech about how much they value our privacy.

Google’s privacy promise is about control

Google’s privacy promise is that you’re in control of the data you produce by interacting with their services: you have the power to look at it and delete it. …


Health data is arguably the most valuable data we have — which is why companies want it


We all know that there’s a lot of money in health: humans have frail, sensitive bodies that are prone to disease and ageing, and all of us need to access healthcare at various stages throughout our lives.

But what kind of world are we walking into when Google suddenly becomes your gateway to healthcare? Can a large tech company replace a hospital or a doctor’s office?

Should Google be your doctor?

Google, like others, has very recently made significant moves to cement its place in the so-called ‘healthcare market’. …


The current rhetoric is that data privacy is something you should care about — but in order to truly keep data private, you actually have to value it.


Earlier this year, Twitter CEO Jack Dorsey had his Twitter account hacked via SIM swapping. How did this happen?

Because SIM swapping is extremely easy: all it takes is one phone call to the victim’s mobile carrier to request that their number be moved over to a new SIM. Once this is done, the hacker (or Cyber Criminal, if you want to be cool) can intercept two-factor authentication texts and get into a social media account by doing a password reset.
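To make that failure mode concrete, here’s a toy model in Python. None of it is any real carrier’s or platform’s actual system; the class names and flow are invented purely to show that once account recovery runs over SMS, “whoever receives texts at this number” quietly becomes “whoever controls this account”.

```python
# Toy model only: invented names and flow, not any real carrier or platform.
import secrets


class Carrier:
    """Maps phone numbers to SIMs. One convincing phone call re-points a number."""

    def __init__(self):
        self.number_to_sim = {"+15551234567": "victim-sim"}

    def port_number(self, number: str, new_sim: str) -> None:
        # The social-engineering step: a support agent "verifies" the caller
        # and moves the number onto the attacker's SIM.
        self.number_to_sim[number] = new_sim

    def deliver_sms(self, number: str, message: str) -> str:
        # Whoever holds the SIM currently mapped to this number reads the text.
        return self.number_to_sim[number]


class SocialMediaSite:
    """Password reset falls back to an SMS code sent to the number on file."""

    def __init__(self, carrier: Carrier):
        self.carrier = carrier
        self.account_number = "+15551234567"

    def reset_password(self) -> str:
        code = secrets.token_hex(3)
        return self.carrier.deliver_sms(
            self.account_number, f"Your reset code is {code}"
        )


carrier = Carrier()
site = SocialMediaSite(carrier)

carrier.port_number("+15551234567", "attacker-sim")  # the one phone call
print(site.reset_password())  # -> "attacker-sim" now receives the reset code
```

The fix isn’t complicated (use an authenticator app or a hardware key instead of SMS), but as long as the carrier’s helpdesk is the weakest link, the phone number itself is the password.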

The person who hacked @jack was probably not in it for the money — it was likely just a flex. But there are communities who buy and sell ‘valuable’ user accounts. In this episode of Reply All, a woman had her Snapchat account hacked in the same way, and her handle sold for $1500 on a forum called OG Accounts (great name). …


Is the cost of refusal greater than the cost of participation?


Here’s something we must understand about a free platform like Facebook: it is an ad network, designed to take in as much information about humans as possible so it can thrive on a bed of behavioural data. In a lot of ways this is a threat to our autonomy, because free platforms like this profit from our private experiences.

Here’s the problem: Facebook is useful to a lot of people. So is Google. So is Amazon. But is all the tracking and surveillance worth what they give us in return, even if these platforms don’t cost money to engage with? …


If all your behaviour is recorded digitally every day, so is your health

By Georgia Iacovou


If you’ve read Incognito before, you may have come across April, our fictional dummy for privacy-related experiments. Last time we saw her, she used an app that was putting all her data on a blockchain, making her unable to delete it. We also had a look at what might happen if her data was made more easily accessible to her, the government, and her employer (sucks for April).

Consistently gathering behavioural data on a mass scale has a lot of potential — this data is very versatile, and is used by ad networks to predict what we will do next, so they know what products to sell us. …
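To give a sense of how low the bar is, here’s a toy sketch with invented features and labels, nothing resembling a real ad network’s models: a few columns of logged behaviour are already enough to start scoring people on what they’ll probably do next.

```python
# Toy illustration only: the features, labels and numbers are all invented.
from sklearn.linear_model import LogisticRegression

# Each row is one person's logged behaviour for a day:
# [minutes in app, product pages viewed, items added to basket]
behaviour = [
    [12, 1, 0],
    [95, 14, 2],
    [40, 3, 0],
    [120, 22, 5],
    [8, 0, 0],
    [60, 9, 1],
]
bought_something = [0, 1, 0, 1, 0, 1]

model = LogisticRegression().fit(behaviour, bought_something)

# Score a new user the moment their behaviour is logged.
new_user = [[70, 11, 1]]
print(f"Predicted purchase probability: {model.predict_proba(new_user)[0][1]:.2f}")
```

Real ad networks do this with thousands of signals and far fancier models, but the shape of the problem (behaviour in, prediction out) is the same.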

About

Georgia Iacovou

Writing about data privacy and tech ethics; dismantling what Big Tech firms are doing in a way that’s easy to understand
