Personal Tracking Apps — The Butterfly Effect

The end of our most basic freedom: the freedom of movement?

Deep Parekh, PhD
CEO-CF Reflections
13 min read · May 18, 2020

--

Photo by Linus Nylund on Unsplash

Constantin Pavléas and I know each other from our frequent encounters at CEO-Collaborative Forum, a senior executive and entrepreneurs’ community and live collaborative platform for business leadership and growth; its mission is to take companies to the next level, connect international CEOs who have faced similar challenges, and give them the opportunity to learn and grow from each other.

Constantin is the Managing Partner of Constantin Pavléas Avocats, a prominent Paris-based law firm in the technology and digital-economy space, and he is personally a vocal advocate of personal data rights.

Covid-19 Tracking Apps and Government Governance

I recently caught up with Constantin because he has been all over the news in France, discussing the implications of the personal-data and geo-location tracking apps that governments are considering as technology solutions for containing and managing the spread of Covid-19 across Europe and the rest of the world.

Constantin and I share a common interest in this area because it is the ultimate question of Governance within the larger discussion around ESG (Environmental, Social, Governance), seen through the age-old question of the poet Decimus Junius Juvenalis, “Quis custodiet ipsos custodes?”, translated as “who will guard the guards themselves?” At my firm, Epistemy, we devote significant attention to this question. And, more importantly, to where this will lead.

Merriam-Webster defines the ‘butterfly effect’ as “a property of chaotic systems (such as the atmosphere) by which small changes in initial conditions can lead to large-scale and unpredictable variation in the future state of the system”.

Personal Data Tracking — the Butterfly Effect

We find ourselves in the midst of a global crisis with nowhere to hide! We were blindsided by it, relatively speaking, and it spiraled from a ‘China problem’ into a global pandemic very rapidly, leaving everyone very little time to prepare. As governments try to get the crisis under control, they are drawn to technological solutions that seem close at hand, with startups and industry goliaths (Apple, Google) alike trying to flog their solutions. All these technological solutions seem to point towards tracking people and gaining insight into the spread of the virus through proximity algorithms that use personal and geo-location data to predict and track the spread of the illness. These solutions may seem innocuous at first, but using the butterfly-effect analogy, there may be grave danger lurking in the unintended consequences of opening up personal and geo-location data.

CEO-CF arranged a web-cast interview in which I had the privilege of asking Constantin to enlighten 27 leading CEOs of high-growth European companies on the issue of personal data tracking and its potential long-run effect on our personal freedoms in Western democracies. This article is based on that interview.

Deep (DP): So Constantin, can you give us a quick summary of the current situation?

Constantin (CP): We are in the midst of not knowing very much. Epidemiologists take pride in saying that they understood the virus’s sequence very early on, but there is so much about the symptoms that we do not understand, or why they come about. We don’t have exact figures either, because the figures are so politically charged that politicians are unwilling to share exactly how many cases we have, how many deaths, and so on. Of course, there is a lot of mathematical modeling going on, but the truth is that we do not have the whole picture. Specifically regarding the question of technology, we know that China has extensively used technology to find the prevalence of the virus, and it will be interesting to see how European countries embark on the use of this technology to track and manage the spread of the virus as we ease out of lockdown.

What kind of data are we talking about?

DP: So when we talk about technology and systems, the obvious question that comes up is about the data. What kind of data are we talking about here? Just to get a sense of the breadth and depth of the data, can you give us some examples?

CP: Sure. Well, let’s take the Chinese example, because it is the most extensive one. What China has done is build a social credit system that monitors citizens and controls and scores their social behavior. So that is a system that intrudes deeply into citizens’ privacy. When the Covid-19 crisis emerged, they built upon that system to create a health code. They asked Alibaba — the e-commerce tech giant — to develop an algorithm that would assist the state in predicting whether a person is infected by the virus or risks being contaminated.

DP: So when we talk about a person, are we really talking about a real person or an abstract data point based on a SIM card number? How do they define a ‘person’ in terms of a data record?

CP: Well, in China they know you very well: they know your social security number; they know what transactions you have done; they know your face because they do facial recognition; they know your whole profile through the social credit system, where every time you are on social media and every time you make a transaction with your credit card, online or in person, all of this is stored. Your face and your biometric data are stored in a central database. They know who you are, what you do, what you like — much as Facebook does. But the difference is that Facebook is not the state. China has, I would say, as much information about a citizen as Facebook or Google has about an active user. The problem is that here it is the state that holds it.

What about GDPR considerations?

DP: But that is China — is it different in Europe, for example, where we have strict GDPR regulation?

CP: Yes, you have GDPR; but what is GDPR? GDPR is a regulation that strikes a balance between privacy, the freedom to operate, and e-commerce. In a nutshell, there is a balancing of conflicting interests, and sensitive data such as health data need to be handled with particular caution and security. The principle for dealing with health data is consent: you cannot process this data if the person whose data it is does not consent. But there are a few exceptions. The most notable one, under GDPR in all 27 EU countries (and let’s add the UK for the moment), is that if there is an important public interest that commands the use or processing of this data for the public good, then consent can be circumvented.

The political reality vs. practical complications

CP: But there is a political reality to this, which is the reluctance in Western democracies to use digital tracing systems against Covid-19 without a person’s consent, even where the law would allow it. Let me share an example: there has been a lot of communication around the digital tracing apps that have popped up in Europe. Why? Because in Singapore, in late March and April, they started with an application that used Bluetooth to detect whether people had been in proximity to one another. It was voluntary to use, though what ‘voluntary’ means under such circumstances is another matter, legally speaking.

The problem with the application in Singapore is that it didn’t work, because it was not ‘efficient’. When I say ‘efficient’ I mean that the application needed Bluetooth running actively in the background all the time, which is something that neither Apple nor Google, whose operating systems run most mobile devices today, provided functionality for, nor were there common standards for interoperability. This functionality (having Bluetooth operational in the background perpetually) raises all sorts of questions around security and device battery consumption. So Singaporeans did not use it en masse — only one in six used it, and with such a low use rate the application was useless, even though Singapore is a very law-abiding society.

So why is Europe excited about a technology that didn’t work?

DP: So, if it didn’t work in Singapore, why are European countries excited about this technology?

CP: Singapore’s technology trial was a month ago, which seems like an eternity today, with events changing on a daily basis. By the time Singapore declared that the app didn’t work, in early April, European countries were going through a steep rise in cases, and the politicians were grasping for any answer they could find. So France, Germany, Spain, and Austria thought this was a really good idea. They needed to do something about the tragedy, so they said: we will copy Singapore’s app — it’s digital, it’s technology, it is something that can proliferate through the population easily and at low cost. They communicated about it as if it were a panacea for the problem, something magical that would help with confinement and data gathering. So Europe also launched apps with voluntary registration.

What about the Google-Apple API technology?

DP: So what about this Google-Apple intervention in the meantime?

CP: Well, smelling blood in the water, so to speak, Apple and Google quickly arrived on the scene with an agreement to create a common standard and API that would allow both operating systems to run Bluetooth in the background. A few days ago they published specifications, conditions, and terms of use, stipulating that the API would only be available to one organization per country, working for the government’s health authorities. It is very important to note that the technology works on a decentralized database structure. What does this mean? It means that Apple and Google do not allow any country to keep its own central database — the data resides with Google/Apple and on the device from which it was sourced, citing personal data privacy, ironically.
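To make that decentralized idea more concrete, here is a minimal sketch in Python. It is my own illustration of the general design Constantin describes, not the actual Apple/Google specification, and all names and parameters are assumptions: each phone broadcasts short-lived identifiers derived from a secret daily key, keeps a local log of the identifiers it hears, and only the daily keys of confirmed cases are ever published; matching happens entirely on the device.

```python
# Illustrative sketch only (assumed names, not the real Apple/Google API): a decentralized
# exposure-notification scheme in which the contact log never leaves the phone.
import hmac
import hashlib
import os

def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier a phone would broadcast over Bluetooth."""
    return hmac.new(daily_key, f"interval-{interval}".encode(), hashlib.sha256).digest()[:16]

# Each phone keeps its own secret daily key and a local log of identifiers it has heard.
my_daily_key = os.urandom(16)
heard_identifiers: set[bytes] = set()  # filled in by the phone's Bluetooth scanner

def check_exposure(published_keys_of_positives: list[bytes], intervals_per_day: int = 144) -> bool:
    """Re-derive identifiers from the published keys of confirmed cases and match them
    against the local log. Only this yes/no result is surfaced to the user; no central
    database of who met whom is ever built."""
    for key in published_keys_of_positives:
        for interval in range(intervals_per_day):
            if rolling_identifier(key, interval) in heard_identifiers:
                return True
    return False
```

The design choice matters because, as Constantin notes, it prevents any country from building its own central database of contacts.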

What about the access to the underlying personal data?

DP: So can you tell us more about the effects of this agreement on the data, such as how long the information is kept for and how it is used?

CP: For the European countries, this will not be geo-location data. That was the red line for the EU Commission. They took this position, rightly or wrongly; one can question it, because countries with geo-localized information have been more efficient in dealing with the crisis. So it goes only by smartphone identifier, and by whether one particular smartphone has been in the vicinity of another smartphone whose owner has tested positive for Covid-19 during the last 14 days. When that happens, the non-contaminated person receives a notification, provided they are using the same application. So then, conceivably, my contact history could be mined by those who have access to this app — say Apple and Google, and potentially the government.
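As a second hedged sketch, the behaviour Constantin describes (anonymous smartphone identifiers, a 14-day look-back window, no location data, and a notification when a contact later tests positive) might look roughly like this; the class and method names are my own assumptions, not any official app’s code.

```python
# Hypothetical illustration of the 14-day proximity rule described above;
# identifiers are opaque strings and no geo-location is ever stored.
from datetime import datetime, timedelta

RETENTION = timedelta(days=14)

class ContactLog:
    def __init__(self) -> None:
        self._seen: dict[str, datetime] = {}  # identifier -> last time that phone was nearby

    def record(self, identifier: str, when: datetime) -> None:
        """Log an encounter with another phone, known only by an opaque identifier."""
        self._seen[identifier] = when

    def prune(self, now: datetime) -> None:
        """Forget encounters older than the 14-day retention window."""
        self._seen = {i: t for i, t in self._seen.items() if now - t <= RETENTION}

    def should_notify(self, positive_identifiers: set[str], now: datetime) -> bool:
        """True if this phone was near a phone whose owner tested positive within
        the last 14 days; location never enters the picture."""
        self.prune(now)
        return any(i in self._seen for i in positive_identifiers)
```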

The technology all seems to point towards helping people — so what’s the harm?

DP: So, with all this data, what we’ve been told as honest, law-abiding citizens is that this is all for our own good — I would be informed in real time if I had been in contact with someone who has tested positive for the virus. So what’s the harm?

CP: Indeed, what is the harm? Well, some might say: I prefer my freedom of movement, which is constrained right now during this confinement period, to having my phone registered in a government database somewhere to inform me that I have been in proximity to someone who has tested positive for the virus. However, there are a lot of questions. The government could say that, because you have been in proximity to someone who tested positive, you MUST stay in confinement until it is confirmed whether or not you are contaminated — and how long might that take, considering that tests are not widely available? If you do not agree, what are the consequences? Would your employer refuse you access to the work premises, since knowing this data could make an employer criminally liable for allowing you on site without the appropriate safeguards to protect other employees? So then the question arises: is this really ‘voluntary’? If there are negative consequences of not using the application, such as being stripped of access, is it really ‘voluntary’?

In what ways can this technology be used in the future?

DP: Let’s imagine that we’re past this Covid-19 event and a couple of years into the future; what can you see this technology being used for in the future?

CP: If it is used in a coordinated way among European countries, and more broadly at a regional level, it can help us fight the next waves of this virus or future ones. We have had H1N1 and other viruses of pandemic proportions in the last 20 years. There is a pattern that we cannot ignore, so there will be a next one.

DP: But let’s go beyond pandemics — I’m just trying to push the boundaries of our thinking — what might we see this type of personal tracking technology used for? Do you see it being used for a sort of ‘Minority Report’ type of application, where the state knows that you’re about to commit a crime?

CP: Well, I would not say that this specific type of app would be able to do that, but the Chinese convergence of personal data, facial recognition, and geo-location, combined with social scoring, is heading in that direction. If we take the example of digital tracing, well, tomorrow you could use it for counter-terrorism initiatives. You might be able to use it for riot control in countries where riots are treated as public unrest. The Chinese algorithms run so deep that they could in fact be used for a Minority Report type of scenario, where the government can predict when and where you might be about to commit a crime. You can predict a person’s profile under different conditions, based on past behavior, their social media posts, and the circle of people around them, both physically and in their social interactions. This is why it is so important to have rules in our democracies, rules that reflect our values. So if we allow digital tracing today, we have to be clear about the future implications, with transparency from our governments.

CP: But in Europe this is somewhat contained: France, Germany, Spain, Austria, and Italy are working on different applications, with no plans for interoperability between them. So I could be a French national traveling to Germany, and their app or government system would never be able to track me or capture my data. Fragmented applications without interoperability limit the butterfly effect beyond one’s own borders.

What about the butterfly effect?

DP: So, how does this translate into the butterfly effect?

CP: Think about it: if you have Apple or Google Home applications that control your door lock and your windows, and you also use this type of tracking application, then it is conceivable that the government might flag you as a ‘suspect’ for something and issue orders to Google or Apple that would prevent you from leaving your house until the matter is cleared up — essentially a house arrest executed through the application’s capabilities combined with technologies that are already in place. But this could also be used to prevent mass contamination. This potentially infringes on some of our core freedoms — the freedom of movement, the freedom to access due process, and beyond. You might be in a situation where you are presumed guilty until proven innocent, so to speak. If these applications are combined with physical checks, they can uphold our rights; used recklessly, they risk trampling all over them.

Conclusions

DP: Any final thoughts?

CP: We need to put checks and balances in place for such tools and tracking algorithms. Otherwise we risk a massive erosion of personal freedoms: for instance, insurance companies might access this database and use it against people. Employers might do the same, as might banks or other institutions. The technology can be of tremendous value if used responsibly, but, at the end of the day, we must rely on our governments and institutions to create the right governance mechanisms to safeguard our personal data and privacy.

More about Constantin Pavleas

Constantin Pavleas is a regular radio editorialist on digital technology issues. He has written several articles on the legal issues raised by the use of digital tools against Covid-19 and on tech giants’ interest in the e-health race. He has also been interviewed by major French TV and news media, such as:

  • Le Figaro
  • LCI news
  • BFMTV
  • CNews

Constantin has called for the creation of an ethics committee to advise the government on the use of digital technologies in the fight against Covid-19. He has apparently been heard: the State Secretary for Digital Technology has announced the creation of a “comité de suivi” (monitoring committee) including IT lawyers, members of parliament, and NGOs. Constantin also calls for the creation of a European Union of Health, with primary decision-making power on public health matters.

In 2001, Constantin founded Pavleas Avocats, a law firm specializing in information technology, data privacy, and intellectual property (IP). He advises clients on IP strategy, complex IP and technology transactions, and data protection, and he regularly engages as IP counsel in M&A transactions.

Constantin teaches information technology law and runs negotiation workshops at the postgraduate level at HEAD (École des Hautes Études Appliquées du Droit), where he coordinates the Digital Law and Intellectual Property LLM Program.

The Lefigaro.fr articles set out below include excerpts from Constantin’s interviews. For clarity, he did not author these articles.

https://www.lefigaro.fr/secteur/high-tech/stopcovid-la-cnil-dit-oui-sous-conditions-a-l-application-de-tracage-20200426

https://www.lefigaro.fr/actualite-france/coronavirus-une-majorite-de-francais-sont-opposes-au-tracage-numerique-20200412

https://www.lefigaro.fr/politique/le-scan/coronavirus-les-deux-semaines-ou-l-executif-s-est-pris-les-pieds-dans-le-tapis-sur-le-projet-stopcovid-20200429

https://plus.lefigaro.fr/tag/tracage-numerique

https://plus.lefigaro.fr/tag/tracage

https://plus.lefigaro.fr/tag/stop-covid

https://plus.lefigaro.fr/tag/singapour

His relevant radio editorials are the following:

https://euradio.fr/2020/03/04/big-data-et-coronavirus-ledito-de-constantin-pavleas/

https://euradio.fr/2020/02/05/la-subordination-europeenne-face-aux-geants-americains-de-linternet-constantin-pavleas/
