Reflection on the Review of the ePrivacy Directive workshop

Elinor Carmi
Apr 15, 2016


Privacy is definitely the buzzword in Brussels this month. With the European Parliament approving the General Data Protection Regulation (GDPR), the approval of the Passenger Name Records (PNR) Directive, and the opinion of the Article 29 Working Party (an advisory body on data protection) on the adequacy of the EU-US Privacy Shield draft, the discussions on the privacy of electronic communications could not come at a better time. But while those other pieces of legislation receive a lot of media attention, the ePrivacy Directive remains in the dark. This Tuesday I went to the ‘stakeholder workshop’ that the European Commission organized to review the ePrivacy Directive; these are my thoughts.

A bit of background

The first reference point for the ePrivacy Directive is the 1950 European Convention on Human Rights (ECHR). The main relevant articles are Article 8, the right to respect for private and family life, and Article 10, freedom of expression. Fast forward to 2000, when the EU published the Charter of Fundamental Rights, which introduced a new fundamental right in its Article 8: the protection of personal data. Combined, these three articles indicate that people’s autonomy in a democratic society is intertwined with their ability to freely express themselves in communications that must be kept confidential. In 2002, the Commission realized it needed to adjust to the internet and introduced the Regulatory Framework for Electronic Communications. It contained four Directives: the Universal Service Directive, the Access Directive, the Authorisation Directive, and the Directive on privacy and electronic communications (ePrivacy).

The ePrivacy Directive was always seen as complementary legislation to the Data Protection Directive from 1995, as the latter only dealt with personal data while the former deals with all electronic communications. The ePrivacy Directive is commonly called the ‘cookie Directive’, because it is the first legislation to directly address cookies, in the notorious Article 5(3) on confidentiality of communications. The last time the ePrivacy Directive was revised was in 2009, and therefore the Commission realized it needed to adjust it once more.

The stakeholder workshop: Towards a future proof ePrivacy legal framework

The main objectives of the review, according to the Commission, are threefold: strengthening trust in the ‘online world’ by increasing the security and confidentiality of communications, boosting the Digital Single Market, and delivering legislation fit for the digital age. The main goal of this workshop was to hear various interest groups’ views on ePrivacy, while the Commission tries to maintain a balance between citizens’ fundamental rights and businesses’ economic endeavours.

The plenary session started with opening words by Rosa Barcelo, head of sector for digital privacy at DG CONNECT, who gave a brief overview of what the Commission hopes to achieve with this revision and the main topics it focuses on, while emphasising that it is open to hearing other ideas and directions. Cristina Vela and Boris Wojtan from ETNO and GSMA, which represent European mobile network operators, argued for the need to push for economic growth, emphasising job opportunities and innovation, specifically from data-driven markets. Cornelia Kutterer, Microsoft’s Digital Policy Director, shifted the discussion to the regulation of states in communication privacy, hinting at the Apple vs. FBI case. Then David Martin from BEUC, which represents consumer organizations, argued for retaining the opt-in approach, and urged a serious discussion on tracking, location and traffic data and the other risks and problems that come with big data.

YouTube not respecting Do Not Track (DNT).

Gwendal Legrand from CNIL (the French data protection authority) shifted the debate to other measures that can be taken at the browser level (for example, privacy-protective default settings) and mentioned Do Not Track (DNT) as a possible technical measure. Frederik Borgesius, a researcher at the Institute for Information Law at the University of Amsterdam, suggested that the scope of the Directive needs to be wider and include over-the-top (OTT) services such as WhatsApp and Skype, and pointed out that ePrivacy also deals with another fundamental right: freedom of expression. Estelle Masse from Access Now, an NGO advocating for digital rights, argued that the Directive does not differentiate between different kinds of cookies, specifically first- and third-party cookies, while other types such as super-cookies already exist and are resilient to users’ control mechanisms, as they come back again after deletion. Masse mentioned their research on privacy concerns over cellphone tracking headers and pointed to the risks they involve. After this introduction there was a coffee break to fuel up for the break-out sessions.
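On the wire, DNT is a remarkably small mechanism: the browser simply attaches a `DNT: 1` request header, and honouring it is entirely voluntary on the server side, which is how a service can ignore the signal. A minimal sketch of that signalling (the function name and browser string are invented for illustration, not part of any real browser API):

```python
# Sketch of how Do Not Track works on the wire: the browser adds a
# single "DNT: 1" request header. Whether the server honours it is
# voluntary, which is why a service can simply ignore the signal.
# (build_request_headers and the User-Agent string are hypothetical.)

def build_request_headers(dnt_enabled: bool) -> dict:
    """Assemble HTTP request headers, adding DNT: 1 when the user opts out of tracking."""
    headers = {"User-Agent": "ExampleBrowser/1.0"}
    if dnt_enabled:
        headers["DNT"] = "1"  # 1 = "do not track me"
    return headers

print(build_request_headers(True))
```

Nothing else changes in the request, which is precisely why DNT needs legal backing to have teeth: the receiving server is free to read the header and track anyway.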

Let the (argument) games begin

Photo credit — Aurelie Pols

This time, the Commission decided to ask participants what they think in a different way. The 200-odd participants were asked to sit at tables of no more than six people to discuss one of the five main issues the Commission proposed: 1) ensuring consistency with the GDPR; 2) the scope of the ePrivacy instrument; 3) consent to access information stored in smart devices/terminal equipment; 4) the need for additional legal measures to reinforce security; 5) anything else you want to raise (not covered by the preceding four themes). There were four rounds of break-out sessions of 30 minutes each, where people could switch to another topic or stay with the same topic but at a different table, talking to other people. The tables were moderated by self-appointed ‘hosts’ who guided the group through a description of the problem, possible policy actions and their assessment (pros and cons, including costs), and recommendations. This method was far better than presenting panels of stakeholders, as it gave people a chance to talk and listen to various views.

However, the 200 or so people who attended this workshop were far from representing a balanced perspective. Telcos and advertising lobbyists from across Europe made up the majority of the participants, while human rights NGOs were a minority, though the most vocal on the workshop’s Twitter hashtag #ePrivacy. Academics, unfortunately, could be counted on one hand. The problem with such a majority of lobbyists, professionals who live and breathe Brussels politics, is that the discourse is then shaped according to their agenda. And while their position in this debate should be taken into account, it is important to consider other views as well. At almost every session the Commission had to ask privacy advocates to ‘balance’ tables where only telcos and advertising lobbyists were sitting. Though it is great that the Commission noticed and made an effort to create more balanced tables, it might be a good idea in the future to make sure more citizens and privacy advocates are invited to, and attend, these discussions.

Two pillars

As the discussions at the tables progressed, two main views emerged, splitting opinion between the commercial aspects of electronic communication and human rights. It is important to note, though, that these views do not necessarily have to be opposed to one another, and the notion that people’s fundamental rights can be respected in the design, use, operation and management of media technologies should be a viable option.

On one side, the lobbyists’ arguments, although stemming from different needs, focused on three main things. First, overlaps between the ePrivacy Directive and other pieces of legislation which are also being reviewed at the moment, such as the GDPR (they could cite all 261 pages of the recent proposal by heart), the Digital Content Directive, and the Consumer Rights Directive, which according to them make most, if not all, of the ePrivacy Directive’s articles redundant. Second, the need for harmonisation in implementation, reducing the multiple interpretations of member states, data protection authorities (DPAs) and national regulatory authorities (NRAs). Third, avoiding the click-fatigue of people having to consent endlessly by clicking ‘I agree’.

On the other side, privacy advocates argued for a wider scope and for an important clarification of who the black-boxed third parties are (to see them, you can check out the Bouncer browser extension). They also pointed to a need for more media literacy, and for a single place to object to the practice of profiling (and not the industry version, YourAdChoices). An important point was the notion of ‘informed consent’ in situations where users’ refusal (of cookies) means they cannot access a service, which raised questions of market power (are there no alternatives?), user control and the burden of being informed that is placed on people.
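The first- versus third-party distinction the advocates raised comes down to comparing the domain that set a cookie with the domain of the page being visited. A rough sketch of that classification (the helper is hypothetical; a real browser would consult the Public Suffix List rather than rely on naive suffix matching):

```python
# Rough sketch of the first- vs third-party cookie distinction:
# compare the domain that set the cookie with the page's domain.
# (cookie_party is a hypothetical helper; real implementations use
# the Public Suffix List instead of plain suffix matching.)

def cookie_party(page_domain: str, cookie_domain: str) -> str:
    """Classify a cookie as first- or third-party relative to a page."""
    page = page_domain.lstrip(".").lower()
    cookie = cookie_domain.lstrip(".").lower()
    if cookie == page or page.endswith("." + cookie) or cookie.endswith("." + page):
        return "first-party"
    return "third-party"

print(cookie_party("news.example.com", ".example.com"))      # first-party
print(cookie_party("news.example.com", "tracker.adnet.io"))  # third-party
```

The point of the advocates' critique is that the Directive's text treats both cases identically, even though only the second one silently hands your browsing history to an actor you never chose to visit.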

Marketing Technology Landscape evolution, taken from

Rethinking regulation

The title of the workshop may indicate one of the core problems with EU legislation, and with state regulation in general, when it comes to media. Trying to find a silver bullet for media regulation, a ‘future proof legal framework’, is flawed. While the tech and advertising industries evolve according to Mark Zuckerberg’s motto of “move fast and break things”, state or regional legislators still work under old law-making procedures. A clear example is the GDPR, which replaces a Directive drafted in 1995, was called for revision in 2012 and is only now being finalised. That does not include the implementation time, which might take another couple of years. On the internet this is a long time, which means several generations of technology running wild and unregulated. Realizing that states cannot address such rapid changes through legislation or enforcement, the EU and the US adopted the self-regulation approach. However, such solutions are extremely problematic, as trade associations do not operate as internet police and merely draft guidelines which companies can choose to respect or not. As we have seen with the Max Schrems case, agreements such as Safe Harbor, which were meant to protect transfers of personal data outside the EU, have failed.

Another issue is that legislation, at least in the media sector, works in defence mode; regulation comes only after technologies and services are already developed, racing ahead and making their existence a seemingly unavoidable reality. The same thing happened during the industrial revolution, when people and nature were severely harmed by ‘progress’ and ‘innovation’, and technologies were only later regulated. This kind of technological determinism, which constructs a narrow discourse about media as an inevitable and necessary evolution for economic growth, is only one version, but it is presented as the only one. That is not to say that technology cannot be beneficial for society, but rather that the way it is designed, developed, used and regulated can take many forms, and should be open for discussion.

What if we started thinking about regulating media and communication like the pharmaceutical industry? Drugs and medicines need to be approved before they go on the market, to ensure that they are safe and effective. Since media devices and communication on the internet are inseparable from our everyday lives, harming our digital bodies has effects on our physical bodies and vice versa. Cellphones are already our most intimate devices, knowing almost everything about us, and with the introduction of the internet of things (IoT) and wearables, the false division between our online and offline lives prevents our human rights from being applied to all of the worlds we operate in.

So why should we care?

South Park, Season 18 episode 8 — Sponsored content.

One of the telco lobbyists sitting beside me said that there are three kinds of people we need to think about here: the companies, the normal user and the privacy paranoids (I guess he meant me). But after the Edward Snowden revelations, and smaller but important cases such as Google v. Vidal-Hall and the Belgian Privacy Commission study, many people in Europe have started to raise doubts about the way tech and advertising companies operate. For example, during the panel organized by the Greens party a week before the workshop, called ‘The Reform of the e-privacy Directive: How to get it right?’, the Norwegian Consumer Council (Forbrukerrådet) discussed their recent research, AppFail: Threats to Consumers in Mobile Apps, which examined the terms of use and privacy policies of 20 apps. They argue that most apps fail to respect EU privacy obligations and consumer rights. Specifically, they found that apps can change their terms at any time without notifying people, that they track people even when the app is not in use, that personal data can be sold to third parties (which is hard to pin down, as the policies use ‘may’ and ‘can’ instead of being specific), that deletion of data is difficult, and, importantly, that consent is flawed and needs to be re-evaluated.

South Park, Season 15 episode 1 — HumancentiPad (or why you should read the terms of use)

In addition, the argument about privacy of personal data versus all communication, or about anonymising data, seems moot in itself: as MIT researchers recently showed, you can be identified by just a few data points. Importantly, what is at stake here goes far beyond worrying about your dick pic; your data and communications can be used for price discrimination, and can cause problems with getting a job, a mortgage, health insurance (in the UK context, especially if the NHS is privatized), a house or a car. Profiles are created from our online data and behaviour (sometimes taken out of context), which we do not have access to and whose current and future uses we do not know. They also affect the kind of information you see online, which can create, and already creates, filter bubbles.
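The re-identification point is easy to demonstrate on toy data: strip the names from a dataset and a handful of coarse attributes can still single out every record. A small illustration (the records and field choices are invented, not from the MIT study):

```python
# Toy illustration of re-identification: even three coarse attributes
# can make every record in an "anonymised" dataset unique.
# (The data and fields below are invented for illustration.)
from collections import Counter

# (birth year, sex, postcode) for five hypothetical people, no names
records = [
    ("1985", "F", "1000AB"),
    ("1985", "M", "1000AB"),
    ("1990", "F", "2000CD"),
    ("1990", "F", "3000EF"),
    ("1972", "M", "1000AB"),
]

counts = Counter(records)
unique = sum(1 for n in counts.values() if n == 1)
print(f"{unique} of {len(records)} records are unique on just three attributes")
# → 5 of 5: every person is singled out without any name or ID
```

Anyone holding an auxiliary dataset that links those same attributes to a name (an electoral roll, a loyalty-card database) can then re-attach identities, which is why ‘we only keep anonymised data’ is such a weak reassurance.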

So perhaps, in order to bring trust back, the tech and advertising industries need to show that privacy and ‘innovation’ can come together. However, privacy should not become a product reserved for people who have more money; it should be ingrained in design, protocols, algorithms and use. The foundation of any relationship is trust, and that comes in the shape of transparency: clearly indicating all the actors involved in the service provided, how people’s data is being and will be used and, importantly, making fair terms of use, which are the contracts that establish the nature of the relationship. No one questions the valuable contribution of the free services offered by companies such as Google, Facebook, Skype and WhatsApp, partly enabled by advertising companies (though we also pay for access and advertising through our broadband), but the price people have to pay is not made clear to them. Data are people, and metadata are their behaviour, so when talking about this new market it is important to point out that people are the product.

On 11 April the Commission launched a public consultation inviting citizens, consumer and user associations, civil society organisations, businesses, industrial associations, public authorities and academics to express their views through an online survey, which will be open for 12 weeks, until 5 July. This is EU citizens’ opportunity to express their opinion on this extremely important matter. But it is not only about privacy; it is about what you can do online, which will have effects offline. It is about your autonomy. It takes only 10 minutes to fill out the 33 questions, which will have implications for years to come, but not for a future proof legal framework, because that is not how media works.