Five loopholes in the GDPR

Robert Madge
24 min read · Aug 27, 2017



The EU General Data Protection Regulation (GDPR) is an impressive act of legislation. Some people call it a great law.

The GDPR sets out to provide individuals with protection of their personal data. Secondary goals are to balance the rights of individuals against other rights (including public interest) and to ensure a consistent rule of law for personal data throughout the EU.

These goals had to be translated into words that can be legally enforced. The law has ended up with a lot of words — more than 55,000 — the result of four years of negotiations between the many interested parties. Naturally, there are imperfections.

Some businesses and others don’t like the law and would prefer to avoid it when they can. They will be exploring the imperfections, looking for loopholes.

This article takes the approach of an ethical law hacker — but only theoretically: it examines where it might be possible to get around the law.


#1: ‘Controllers’ outside the EU

The GDPR is meant to protect people in the EU when their personal data is controlled by organisations outside the EU, but it may not. Weaknesses in the wording of the law give the chance for organisations to collect data and ignore the GDPR. Once data ‘escapes’ from the GDPR, it can be passed on to others without legal protection. See the explanation below…

#2: Data losing GDPR protection

Even if data is collected and processed legally under the GDPR, it can be transferred to others and then escape the protection of the law. See the explanation below…

#3: Invisible data chain

If organisations obtain data indirectly, in most cases (and excluding loophole #2) it should still be subject to the GDPR. However, the application of the law in these cases may be only theoretical, particularly in the case of “data chains”. See the explanation below…

#4: Inferred data

Personal data is any data related to a living person. The GDPR imposes obligations on those who process the data and gives rights to individuals. But even when the data stays personal, individuals may lose a number of rights. Organisations can take advantage of this. See the explanation below…

#5: Legitimate interests

It may seem reasonable that organisations should be able to process personal data if they have a good reason to do so, after considering the interests of the individuals involved. However, the way this will work in practice means that many organisations could see it as a loophole in the law. See the explanation below…

Loophole #1: ‘Controllers’ outside the EU

The GDPR states a couple of times in its recitals that protection of personal data of natural persons should take place “whatever their nationality or residence”. The previous data protection directive covered any organisation processing personal data in the EU but did not guarantee the protection of every person in the EU (when their data was processed by an organisation outside the EU). The authors of the GDPR set out to change this, to cover any organisation in the EU that handles personal data and any individual in the EU whose personal data is handled by an organisation, wherever that organisation is based.

The reasoning is obvious. An individual can enter a website and give their personal data, without knowing where their data will be processed. The legislators wanted to give people the assurance that EU law would protect them in all cases.

Take the analogy of going to buy something at a shop in the EU. The purchaser is protected by EU consumer law and doesn’t have to think twice about it — the shopkeeper cannot say “this product is from India and therefore we apply Indian laws of product safety and consumer rights”. The GDPR has set out to create the same situation in the online world: you are protected, full stop.

The devil is in the detail, in the wording of the law. The GDPR states that its territorial scope includes the processing of personal data of someone in the EU by organisations outside, “where the processing activities are related to the offering of goods or services” to that person. The phrase “the offering of goods or services” is subject to different interpretations.

You could reasonably ask, why doesn’t the regulation just say “related to the marketing or supply of goods or services” or, even simpler, “related to a data subject in the EU”? However, the GDPR was written by lawyers, and this wording of “offering” originates from legalese applied in the context of EU competition law. There is ample case law regarding its interpretation, based on the definition of an “undertaking” as an entity that carries on an “economic activity”, where the measure of an economic activity is “offering goods or services” (even if no payment occurs). The case law shows a broad interpretation of “offering goods or services”, covering sales, supply and even purchasing.

Therefore, the original drafters who decided to put in the words “offering goods or services” probably intended to cover any marketing or commercial activity that engages an individual in the EU (with the words “irrespective of whether a payment of the data subject is required” added later in the drafting process to ensure that it covers the new business models of online services such as social media).

Nevertheless, when the regulation was negotiated — and there was a lot of lobbying — words were added to a recital (the ‘contextual’ paragraphs before the main articles of the regulation) which took a different point of reference for interpreting “offering goods or services”. Guided perhaps by the idea that an “offer” takes place before any transaction, the following words were added to Recital 23:

In order to determine whether such a controller or processor is offering goods or services to data subjects who are in the Union, it should be ascertained whether it is apparent that the controller or processor envisages offering services to data subjects in one or more Member States in the Union. Whereas the mere accessibility of the controller’s, processor’s or an intermediary’s website in the Union, of an email address or of other contact details, or the use of a language generally used in the third country where the controller is established, is insufficient to ascertain such intention, factors such as the use of a language or a currency generally used in one or more Member States with the possibility of ordering goods and services in that other language, or the mentioning of customers or users who are in the Union, may make it apparent that the controller envisages offering goods or services to data subjects in the Union.

This wording says that the test is based on whether the organisation “envisages” offering goods and services, not on whether it does in fact offer, or supply, or simply obtain personal data.

This wording originates from a legal judgment that determines in which jurisdiction within the EU (in other words, in which EU country) a case should be heard in a court of law. This case, combining two different actions known as Pammer and Hotel Alpenhof[1], was judged in 2010 by the CJEU and therefore forms part of EU case law. However, the nature and effect of this case is quite different from the context used in the GDPR. Firstly, the court was asked to determine in which jurisdiction a court case should be held, not to determine the territorial scope of application of a law. Secondly, the result was to make a defendant’s claim subject to one of two alternative member state courts, not to either award or deny the protection of a law. (Note: The GDPR contains explicit provisions for determining the jurisdiction, both for administrative and judicial processes.)

We can imagine the scenario of, let’s say, a Chinese company that markets a broad range of products and services that are sourced from third parties. This Chinese company creates a global portal, with a large catalogue of items. The catalogue is accessible worldwide, and might include European languages — and possibly a currency conversion tool to see prices in Euros — thereby presumably constituting an “offering” to people in the EU, but only in the sense of marketing, not selling products. Potential customers would browse this catalogue anonymously and, when they decide to buy a product or service, they click the appropriate link. This link would then take the individual to the website of an independent third-party company, outside the EU (whose website perhaps is not at all EU-centric, being written in English and Chinese, with prices in US dollars). The personal data interchange then takes place with this third-party company.

Following GDPR Recital 23 could put personal data outside the scope of the law

The original company with the catalogue portal would not handle any personal data, so it would not be subject to the GDPR. The third-party company would have deniability about offering goods or services to someone in the EU, since it simply placed its products in a global catalogue. Any personal data given over would escape the scope of the GDPR.

Even without this slightly complicated scenario, there is a problem in the detail of the GDPR wording: “where the processing activities are related to the offering…” Effectively this does not cover all processing of personal data that arises from the “offering”, but only the processing activities related to the offering. If “offering” is interpreted narrowly, as only the phase prior to a transaction or provision of a service, then all the processing activities that take place later — when the most personal data would be obtained — are not covered.

So, a global company wanting to find a loophole in the GDPR can set up a marketing company in the EU. Having done the minimal personal data processing needed to obtain customers, these customers are then transferred for transaction fulfilment, including personal data handling, to a non-EU business.

Once the personal data are “outside the law”, they stay outside the law — unless they are transferred back into the EU. A non-EU company with personal data, and not subject to any restrictions under the GDPR, could sell on the data to any other non-EU company.

The only evident way to block this loophole is for the CJEU to rule that Recital 23 is a misinterpretation of the purpose of the law and that “offering” should have the same interpretation as applied in competition law.

It should be noted that non-EU organisations might still become subject to the GDPR due to Article 3.2(b) that covers when processing activities are related to “the monitoring of their [the data subjects’] behaviour as far as their behaviour takes place within the Union”. This would cover tracking and profiling of individuals in the EU.

Note: If you want to see a detailed discussion on the territorial scope of the GDPR, please see my article GDPR global scope: the long story.

The proposed ePrivacy regulation, which is also due to come into force next year but is still at the drafting stage, does not have this problem of territorial scope. The current draft covers “the provision of electronic communications services to end-users in the Union”, “the use of such services” and “the protection of information related to the terminal equipment of end-users located in the Union”. It does not have separate rules depending on the location of the provider (except that a non-EU provider has to designate a representative in the EU). It would cover, for example, any use of a website by a person who is in the EU (and any automated personal data collection, such as via cookies).

Loophole #2: Data losing GDPR protection

Personal data processed in the EU are clearly covered by the GDPR — no problems here. However, there are further consequences due to the way Article 3.2(a) describes the territorial scope.

Since a non-EU data “controller” (the entity that determines the purposes and means of processing) is only subject to the GDPR when its processing activities are related to the offering of goods or services to the individual in the EU (or to monitoring the person’s behaviour), the same personal data could be processed for another purpose without being subject to the GDPR.

Take an example of a US-based company that collects personal data from someone in the EU. The company complies with the GDPR and follows a valid consent process to get the agreement of the data subject, saying “Please give your permission to process your data so that we can offer you a tailor-made service.” The individual gives their permission and the company carries out its processing accordingly.

Then the company sells the data to a third company, also in the US. This onward transfer of data would normally count as processing under the GDPR, but since this is a processing activity not related to the offering of goods or services to the individual, it is now outside the scope of the GDPR.

Of course, the company buying the data is not subject to the GDPR since its processing of the data will also not be related to an offering of goods or services (and certainly not to processing activities related to this).

The personal data will have leaked out of the GDPR scope. The only hope would be to try and ‘catch’ this data again, if and when it is used to direct an offering to someone in the EU. It could be very difficult to spot that this is happening via targeted advertising, and even harder to find the controller responsible.

It could be difficult to close this loophole. However, an EU court would look to the purpose of an EU law when making a judgement, not just to the specific wording of a provision, so it is perhaps feasible that the CJEU could determine that the words “related to” in the phrase “processing activities related to the offering of goods or services” should be interpreted to include processing “arising from” those activities — therefore any personal data collected during the original processing activities would continue to enjoy the protection of the law when used for alternative purposes.

Loophole #3: The invisible data chain

The intention of the GDPR is that individuals will always know what is happening with their data and will be able to exercise rights over this data. For example, data subjects have the right to access data held by a controller, correct errors, object to processing and request erasure. The starting point for exercising these rights is given in Article 15: “The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed.”

However, how does the data subject know which controller to ask?

Transparency is meant to start from the point of data collection. The controller has to provide a set of information to the data subject at this time, in accordance with Article 13. One of the items of information that the controller has to provide is “the recipients or categories of recipients of the personal data, if any”. A “recipient” is a third party to whom the controller discloses or transfers data.

The controller is not obliged to provide the names of recipients since it can choose to only provide the “categories of recipients”. Even if the individual does an access request, the controller can still limit the response to categories.

Under Article 14 of the GDPR, each recipient controller would have to inform the data subject within a month of receiving the data. But what if the recipient doesn’t do this?

There might be valid reasons why the recipient controller does not provide this information. It might not be able to identify the data subjects whose data it has (and it has to have a high level of confidence that it doesn’t provide the data to the wrong person). Even if the individuals are identifiable, the recipient controller may not have their contact details. Nothing in the GDPR obliges a controller to provide enough information to a recipient to allow it to comply with its (the recipient’s) obligations under the GDPR, and the controller itself is no longer legally liable (except in the case of joint controllers). NB: There is a requirement for a controller to try and pass on data subject rectification and erasure requests, and a special condition to inform controllers about an erasure request in the case of data that have been made public.

If an individual does discover that a company is using their personal data (for example, if they receive a direct marketing communication from a company they do not recognise), then the person can make an access request under Article 15. However, it might be impossible to find out where that company got the data, since the obligation on the company is only to provide “any available information about the source”. Furthermore, Recital 61 says “Where the origin of the personal data cannot be provided to the data subject because various sources have been used, general information should be provided.”

The subject of non-identification of the individual concerned is covered by Article 11. A controller that cannot identify the data subject is absolved from having to respond in detail to a data subject’s requests — except to tell the data subject (“if possible” to do so) that it cannot comply due to lack of identification. The individual can provide the controller with further information to aid the identification, but how this would work in practice is not clear. Article 11 does not exempt the controller from complying with Article 21 (the right to object), nor from the provision of that article requiring a data subject to be informed of their right to object at the latest at the time of the first communication; but this right has no value if the controller never communicates directly.

Then, of course, there will be controllers that decide to ignore their obligations under the law. Unless they communicate directly with the individuals whose data they have, or do something flamboyant with the data that attracts attention, complaints are unlikely and their non-compliance may well go undetected.

In reality, there are currently long ‘data chains’, with personal data held several steps removed from the data subjects. Personal data is bought and sold like a commodity and whole industries, such as adtech, have developed on the back of this data interchange. Despite the clear obligations under the GDPR, these invisible data chains are likely to exist for a long time: perhaps few businesses in the chain will have the motivation or the means to comply and make Article 14 notifications. In many cases, the lawful basis for processing this data does not exist — or was based on pre-GDPR consent implementation — so businesses will not want to declare that they have the data.

In the best of scenarios, there will probably be a continuing black market for personal data.

Closing this loophole would require proactive steps by supervisory authorities to study data chains in operation and pinpoint businesses for enforcement action without waiting for complaints. The modalities for informing data subjects under Article 14 could also be made easier: controllers in direct contact with data subjects could be encouraged to act as a conduit for Article 14 notifications from the recipients to whom they provide data, and the forthcoming ePrivacy regulation could include a special provision that explicitly recognises, subject to conditions, the use of unsolicited communications to comply with GDPR notification requirements.

Loophole #4: Inferred data

The term “inferred data” is not perfect — other phrases are sometimes used, such as “derived data”.[2] It means data that is not in the original format that was collected, but which could still be considered personal data because it is related to an identifiable person.

Examples could range from simple categorisation (such as when a person says that they live in postcode 10963, Germany, and their file is automatically tagged with “Berlin”) to cases where there are human comments (such as when a doctor examines a patient and writes “symptoms of bronchitis” in the file). It could be a car navigation service that classifies a person as a “fast” driver, based on observed behaviour, in order to estimate driving times for that individual; it could be a tag to indicate that someone has a propensity to be susceptible to food-related advertising if presented before 9am.
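The first of these examples can be made concrete. The sketch below is hypothetical — the postcode-to-city table, record fields and function names are all illustrative, not taken from any real system — but it shows how a simple automated enrichment turns collected data into inferred data that still relates to an identifiable person:

```python
# Hypothetical sketch of "inferred data": a record is automatically tagged
# with a city derived from a postcode supplied by the data subject.
# The mapping and field names are illustrative only.

POSTCODE_TO_CITY = {
    "10963": "Berlin",
    "80331": "Munich",
}

def enrich(record):
    """Return a copy of the record with an inferred 'city' tag, if derivable."""
    enriched = dict(record)
    city = POSTCODE_TO_CITY.get(record.get("postcode", ""))
    if city:
        # This tag was never supplied by the person; it is inferred data,
        # yet it still relates to an identifiable individual.
        enriched["city"] = city
    return enriched

profile = enrich({"name": "A. Example", "postcode": "10963"})
```

The point is that the inferred tag is created by the controller, not collected from the individual — which is exactly where the access-rights question discussed below arises.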

This kind of data can in some cases fall under the definition of ‘profiling’, which is explicitly covered in the GDPR in the context of direct marketing or when automated decisions are made on the basis of profiling that have a legal or significant effect on the person. (Another little glitch in the GDPR is that a person can object to direct marketing based on profiling and have it stopped immediately, but there is no obligation on the controller to inform the data subject that any profiling is taking place — unless it produces “legal effects…or similarly significantly affects him or her” — despite a recital that does not include this limitation.)

A 2014 CJEU judgment (YS v Minister voor Immigratie)[3] determined that a legal analysis of an individual is not “in itself” personal data, even though it contains personal data, and therefore the data subject was denied the right to get a copy of this analysis. This conclusion was on the basis that the analysis was an assessment of how an external factor (in this case, relevant laws) applied to the situation of the data subject, not information related to the data subject. A further reason was that an individual’s right of access to their personal data was in place to allow the person to verify the accuracy of the personal data and that it is processed in a lawful manner (and thereby exercise other rights, such as rectification or erasure), and access to the analysis was not necessary for this purpose.

This appears to conflict with the GDPR (a subsequent law), in which Recital 63 states that an individual should have the right to access their personal data, including “access to data concerning their health, for example the data in their medical records containing information such as diagnoses, examination results, assessments by treating physicians and any treatment or interventions provided”. An “assessment” by a physician would appear to fall into the same category as the “analysis” that was the subject of the 2014 legal case.

However, recitals only serve to convey the purpose and help to interpret the articles of an EU law — a recital cannot derogate from the actual provisions (articles) of the law. In Article 15.4 (covering a data subject’s access rights), the GDPR states “The right to obtain a copy [of the individual’s personal data] referred to in paragraph 3 shall not adversely affect the rights and freedoms of others.” This is backed up by further words in Recital 63: “That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software.”

The “others” here can be legal entities, such as the controller of the data. Any time a controller combines personal data from an individual with data from another source, or transforms it through an algorithm, they could use the reasoning from the 2014 “YS” judgment and refuse to provide a copy of this data.

Taking this a step further, an organisation wanting to keep personal data about individuals, without being subject to many of the obligations that come from GDPR rights for data subjects, could simply transform the data by some method (probably by a ‘proprietary’ algorithm, to increase the levels of legal defence). This method could even be reversible, allowing the organisation to re-create the original personal data if it wished — in the meantime deleting the individual’s original raw data on the principle of data minimisation.
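To illustrate the mechanics of such a reversible transformation, here is a deliberately toy sketch: a simple XOR scheme, keyed with a secret held only by the organisation, stands in for whatever ‘proprietary’ algorithm might actually be used. Everything here (key, function names, sample value) is hypothetical, and XOR is emphatically not a secure cipher — the point is only that the stored blob looks opaque while remaining recoverable at will:

```python
# Toy, NOT secure: a reversible transform standing in for a 'proprietary'
# algorithm. The organisation stores only the opaque blob, deletes the raw
# value, and can still reverse the transform whenever it wishes.

SECRET_KEY = b"org-held-key"  # illustrative; known only to the organisation

def transform(raw: str) -> bytes:
    """Turn raw personal data into an opaque byte blob (XOR with the key)."""
    data = raw.encode("utf-8")
    return bytes(b ^ SECRET_KEY[i % len(SECRET_KEY)] for i, b in enumerate(data))

def reverse(blob: bytes) -> str:
    """Recover the original value; XOR with the same key is self-inverse."""
    data = bytes(b ^ SECRET_KEY[i % len(SECRET_KEY)] for i, b in enumerate(blob))
    return data.decode("utf-8")

blob = transform("jane@example.com")  # stored; the raw value is then deleted
original = reverse(blob)              # re-creatable by the organisation alone
```

Whether such a blob would still count as personal data in law is exactly the question the rest of this section explores.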

The resulting data would still, probably, be legally recognised as containing personal data and so the organisation would need to observe the provisions of the GDPR: processing the data lawfully, only doing so for the defined purpose, minimising the data held, keeping it up-to-date, minimising the storage time, maintaining security of the data and being ready to be held accountable. However, it would only need to tell data subjects (in response to requests) about the categories of personal data held but not the details — and would not have to provide a copy (assuming the interpretation of the law given above).

If individuals cannot access a copy of the data, they will not know exactly what data is held. They will not be able to correct inaccuracies, nor of course contest the ‘inferences’ made by the data controller. Even if they had given consent to use of the original data, they would not be able to obtain data portability of the inferred data.[4] They would be left with the right to withdraw consent, or to object to all data processing or to require erasure of all records, but this would be an “all or nothing” result that might not be practical for an individual — implying, for example, withdrawing entirely from a social media platform.

Closing this loophole probably requires further case law on the interpretation of “personal data”, particularly in the context of the GDPR rather than the 1995 data protection directive. Future case law on the meaning of “legal effects…or similarly significantly affects”, in the context of profiling, would also be relevant due to the explicit rights given to individuals in this situation.

Loophole #5: Legitimate interests

The concept of “legitimate interests” of the organisation processing personal data has not changed from the 1995 data protection Directive, and the wording of the provision in the GDPR is almost identical. It requires that the controller balances its own (or a third party’s) legitimate interests against the interests or fundamental rights and freedoms of the data subject. Unless the data subject’s rights override the controller’s interests, the controller can proceed with the processing.

In the past, most businesses did not elect to use “legitimate interests” as the lawful ground for using personal data because it requires an assessment of the balance of interests, which could be subject to later challenge. In most EU jurisdictions, a lax regime has applied to the use of ‘consent’ from individuals — it was normally sufficient to give a data subject an opt-out option covering a broad usage of the personal data — so businesses have tended to go this route. Once they had a ‘consent’, they didn’t need to provide any further justification for what they were doing.

This is changing. The GDPR definition of consent is more demanding than that of the 1995 directive and — crucially — the GDPR is a regulation that automatically applies across the EU. The 1995 directive had to be transposed into national legislation, which gave a lot of scope for different interpretation in different countries. Some national legislation did not even include the definition of data subject consent that was specified in the directive.

In addition, the supervisory authorities that ensure organisations comply with data protection law have indicated that they are going to take a strict approach to judging whether consents have been obtained validly. From May 2018, all consents must be according to GDPR definitions. Since this means specific ‘opt-in’ consents, businesses can assume that they will receive far fewer consents than under the old regime.

Organisations handling personal data, particularly those that are in the business of marketing, are in general revising their data protection procedures to use the claim of legitimate interests instead of consent.

Processing personal data on the basis of legitimate interests should not be a loophole. It has long been accepted as valid, since there are many situations where individuals would accept the processing of their personal data and may not want to be bothered by the mechanisms of giving consent. They have not lost their rights in this case (apart from the new right of data portability, discussed in Data portability is a false promise, and the right to withdraw consent without question). They can still request access to their data, object to processing and pursue other rights such as rectification and erasure.

However, the problem is in the procedure.

Data controllers, which might be commercial businesses, should have a good idea of their own legitimate interests — making money can be one of them. They have to balance their own well-defined legitimate interests against the diffuse and varied interests of the mass of the data subjects, probably applying a single approach for all potential subjects. As was indicated by the Article 29 Working Party, in its opinion on legitimate interests[5], the interests of the data subjects are highly dependent on context and may depend on the personal circumstances of the person.

Under the GDPR — except in cases where there is a high risk to individuals, such as in large-scale processing of sensitive data — the data controllers independently make their assessment of the balance of interests, without supervision and without consulting with the data subjects themselves. The controller has to inform the individual (under Article 13) that it is using a legitimate interests ground for the processing, and it has to describe its own (or a third party’s) legitimate interests, but it does not have to say what interests of the data subject it has taken into account, nor how it has calculated the balance of interests.

If the individual makes a ‘subject access request’, for details of the personal data processing that are taking place, at this point the controller does not even have to tell the person that the processing depends on a legitimate interests assertion. (It is presumably assumed that the data subject was notified at the time of data collection.)

The recourse is meant to be via the right to object, according to Article 21. This is the only way a data subject can find out how a data controller decided that its own legitimate interests were of greater value than his or her own interests. The wording of Article 21.1 is:

The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions. The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.

(Note that it is presumed that the objection is expected to be on the basis of the “particular situation” of that individual, implying that any assessment of the balance of interests will only be applicable to that one person.)

The problem with this stage of the procedure is that the assertion of predominant legitimate interests by a controller against a whole body of data subjects is then only questioned in the circumstances of individual cases.

Furthermore, when the controller is called upon to “demonstrate” its compelling legitimate grounds that override the interests of the individual, there is no process defined for a potential independent assessment. Presumably the controller has to demonstrate its position to the data subject, but there is no requirement to inform the supervisory authority or anyone else — unless the data subject is unhappy and complains.

Nearly all organisations, particularly those with commercial interests, will take a decision on how to deal with the GDPR in terms of a balance of cost, risk and reward. This will not be “risk” as generally covered by the GDPR, which is the risk to the rights and freedoms of individuals through misuse of their personal data; it will be the risk to the enterprise.

The costs, risks and rewards equation of using ‘consent’ looks bad (for commercial organisations with traditional business models) under the GDPR:

· Costs: Structuring and implementing a good consent procedure could be expensive.

· Risks: If any single consent is ruled invalid, this would probably apply to all consents under the same procedure and so processing of all the data subjects involved would have to stop immediately.

· Rewards: Since the procedure is ‘opt-in’, take-up by data subjects will be much lower than with ‘opt-out’.

The costs, risks and rewards equation of using ‘legitimate interests’ looks good, even if the balance of interests calculation is unfairly biased towards the interests of the organisation:

· Costs: These may not be so high, since it is an internal exercise and the level of effort put into the balance of interests calculation can be kept low if there is little risk of having to justify it.

· Risks: Falling foul of the law is unlikely, since the methodology of the balance of interests calculation will probably never be tested. In response to any individual complaint, the controller can simply accept that person’s objection and stop processing their data, thereby avoiding having to justify the original logic while continuing to process the data of everyone else without change. If complaints became so numerous that they came to the attention of the supervisory authority, the organisation could defend itself on the basis of the many ‘judgemental’ calls that had to be made when calculating the balance of interests. If the organisation can show basic diligence by reference to an impact analysis conducted at the start, a significant fine is extremely unlikely.

· Rewards: Since this effectively turns ‘legitimate interests’ processing into an opt-out procedure, the organisation will be able to process the data of nearly all the people it wants, just reducing the numbers to the degree that it receives objections/opt-outs.

This loophole arises from the impossibility of defining precise rules to conduct a balance of interests assessment, combined with a procedure that theoretically puts the burden of proof on the controller but in practice leaves controllers almost unsupervised. The loophole does not apply in the case of processing sensitive data, since a controller’s legitimate interest is not a lawful basis to do this (excluding the special cases of healthcare and certain non-profit bodies). However, most processing of personal data does not include sensitive data.

One solution would be a shift back towards the use of ‘consent’, but under the GDPR rules. For some businesses, this might occur due to the forthcoming ePrivacy regulation (see Consent: lost and found). Another would be a more generalised use of ‘contract’.

It appears that responsibility for minimising the effect of this loophole will fall to supervisory authorities. However, these authorities will be overwhelmed with more definitive responsibilities once the GDPR is applied and, in the absence of public complaints, their duty to act on legitimate interest issues is somewhat nebulous. Probably the best that can be hoped is that opinions from the European Data Protection Board (which replaces the Article 29 Working Party next year), guideline documents from the supervisory authorities and codes of conduct from industry bodies (Article 40) will draw clear lines about how to apply the balance of interests calculations and reduce the margin of tolerance for controllers that rely on dubious legitimate interests claims.


This article has focused on five significant loopholes in the GDPR. Another article will describe weaknesses of the regulation that might undermine its success even without any conscious abuse.

However, this article comes with a health warning: it has not attempted to make a balanced judgement of the GDPR. Despite any imperfections, the GDPR is already having a major effect on all industries that make use of personal data — in nearly all cases giving more protection and more usable rights to individuals. Keeping the loopholes as small as possible will have a big impact on its overall success.


See also:

GDPR global scope: the long story

Consent: lost (GDPR) and found (ePrivacy)

GDPR: data portability is a false promise


[1] C-585/08 Peter Pammer and C-144/09 Alpenhof, judgment of 7 December 2010:

[2] The Article 29 Working Party uses the combined expression “inferred data and derived data”, see

[3] C-141/12, YS v Minister voor Immigratie, Integratie en Asiel and Minister voor Immigratie, Integratie en Asiel v M and S:

[4] The Article 29 Working Party uses the combined expression “inferred data and derived data”, see

[5] Article 29 Working Party, Opinion 06/2014 (WP 217):