Data: A New Direction — But Which Direction?

Alan Mitchell
Dec 27, 2021

This is the fifth and final blog in our series about the UK Government’s proposals for data protection reform — “Data: A New Direction”. Previous blogs focused on the thinking behind the proposals. This blog summarises what the main proposals are.

Stated plainly, the UK Government is planning to end data protection rights for UK citizens. Reforms proposed in its paper Data: A New Direction would shift the core operating principle of data protection regulations from citizen protection (that personal data shall only be collected by organisations “for specified, explicit and legitimate purposes”) to a new principle that organisations should have the right to build and maintain databases about citizens without their consent.

This Briefing Paper shows how the Government is planning to achieve this radical ‘new direction’ for data. (Paragraphs 57 and 58 of the Consultation, around which this ‘New Direction’ pivots, are reproduced in the Addendum.)


The Government is taking the opportunity of Brexit to ‘reform’ data protection law. “Now that we have left the EU, we have the freedom to create a bold new data regime,” says the Minister in his introduction. The stated intention of this “bold new data regime” is to “usher in a new golden age of growth and innovation right across the UK”. This is to be achieved by creating “a better balance between protecting individuals and not impeding responsible data use” [Paragraph 59] — a ‘better balance’ that ends citizen data protection rights in all but name, replacing them with corporate rights instead.

The Minister’s introduction states that “The protection of people’s personal data must be at the heart of our new regime. Without public trust, we risk missing out on the benefits a society powered by responsible data use has to offer.” But the actual proposals do the opposite.

What the law currently says

The core principle of existing GDPR data protection regulations is that personal data shall only be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.

A key supporting principle is that of data minimisation: that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”.

There are six conditions for the lawful processing of personal data but the two central ones are that:

  1. the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
  2. processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

Conditions for lawful processing envisage situations where “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party”. But these legitimate interests come with a ‘balancing test’ to determine whether they should be “overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data”.

What the UK Government is proposing

On the grounds of addressing ‘consent fatigue’, the Government is proposing to:

“create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test in order to give them more confidence to process personal data without unnecessary recourse to consent.” [Paragraph 57]

Paragraph 57 adds that “The processing would still have to be necessary for the stated purposes and proportionate.” But the ‘exhaustive’ list of exceptions to this rule provided in Paragraph 58 is so broad that organisations would have the right to process personal data without individuals’ consent in the vast majority of use cases. In other words, Paragraph 58 renders the safeguard meaningless, leaving it as window dressing only.

The pivotal clause is Paragraph 58 (i), which makes “Managing or maintaining a database to ensure that records of individuals are accurate and up to date, and to avoid unnecessary duplication” a ‘legitimate interest’ where organisations need not seek individuals’ consent for the processing of their data.

This deft wrecking amendment turns the current rule — that personal data should only be processed for specified, explicit purposes — on its head, promoting instead organisations’ right to collect data about individuals without their consent. No limit or restriction on organisations’ right to collect data for their databases is mentioned.

The rest of the consultation extends the exceptions to cover most uses to which organisations put data, including for research, business innovation, web analytics, AI training, use of cookies, and data sharing. For example, Paragraph 58(i) is supplemented by Paragraph 58(h), which includes “Using personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers”. No definition for ‘internal research’ or ‘business innovation’ is provided, making them vague enough for Cambridge Analytica to claim that its activities were entirely lawful.

Paragraph 58(c) has another exception. “Monitoring, detecting or correcting bias in relation to developing AI systems” would now also be a ‘legitimate interest’ where individuals’ consent for data processing is no longer needed. This might seem innocent. Desirable even. But practically speaking, the best way to use data to eliminate the risk of bias is to have as comprehensive and complete a database as possible — which means this clause could (and will) be used by some corporations as justification for collecting all possible data about every individual: it opens the door to total data surveillance. (It is also based on a false premise: that the main cause of bias in AI deployments is lack of access to data. This is not true. The main cause of bias in AI is poor management of the processes for testing and deployment: a people thing, not a data thing.)

That this is the direction of travel intended by the Government is confirmed by other proposals which include unspecified measures that would “permit organisations to use personal data more freely, subject to appropriate safeguards, for the purpose of training and testing AI responsibly” [Paragraph 81] and to ‘disapply’ individuals’ current right to object to automated processing [Paragraph 48].

The Government continues in the same direction with its proposals to reduce ‘consent fatigue’ as it relates to cookies. Paragraph 195 proposes to “permit organisations to use analytics cookies and similar technologies without the user’s consent.” (The term ‘analytics cookies’ is used in a variety of ways, without a single, clear definition.)

Paragraph 197 would “permit organisations to store information on, or collect information from, a user’s device without their consent for other limited purposes” where ‘other limited purposes’ “could include processing that is necessary for the legitimate interests.” (In other words, the free-for-all created by Paragraph 58).

Question 2.4.4 simply asks “To what extent do you agree that the requirement for prior consent should be removed for all types of cookies?” Again, a door opener to total data surveillance.

Meanwhile Paragraph 43(b) seeks to expand the grounds for lawful processing of personal data to include ‘research purposes’, with no stipulations on what may be included or not included in the definition of ‘research’.


The purpose of these ‘reforms’ seems to be to create a completely ‘free market’ for the trading of UK citizens’ data, without their consent. The Ministerial introduction talks of “secur[ing] the UK’s status as a global hub for the free and responsible flow of personal data” (Note the word ‘responsible’ again. See below).

To this end, Paragraph 51(g) (part of an extended discussion on lifting restrictions on the ‘further processing’ of personal data) notes that “Innovative data uses may involve sharing personal data sets between different controllers”. This opens the door to corporations trading people’s data without their knowledge and behind their backs. To this end, the Government intends to clarify “When personal data may be re-used by a different controller than the original controller who collected the data, and whether this constitutes further processing.” If this ‘clarification’ is based on the new definition of ‘legitimate interest’, it could make UK citizens’ data a globally traded commodity over which they have no say.

In sum, the net effect of the new regulations would be to turn data protection regulation on its head, effectively removing all main citizen rights and giving organisations carte blanche to collect and use personal data as they wish, without individuals’ consent, thereby opening the door to unrestricted data surveillance and value extraction. All in the name of ‘innovation and growth’.

The regulatory environment

As part of this ‘New Direction’ for data, the Government is also seeking to compromise the independence of the regulator — the Information Commissioner’s Office. The key element of the extended discussion on this subject is the proposal [Paragraph 326] to “place a new duty on it [the ICO] to have regard for economic growth and innovation when discharging its functions”.

The absurdity of this concept becomes apparent if it is applied to other areas of regulatory enforcement. Should the Health and Safety Executive or Trading Standards Officers ‘have regard for economic growth and innovation when discharging their functions’, ‘balancing’ the requirements of health and safety and honesty in trading against the ‘needs’ for economic growth? What such rules and regulations do is create boundaries that channel innovation and economic growth in a certain direction: a direction that protects health and safety rather than undermines it.

Likewise with data protection. When it comes to data, however, the Government wants to dismantle the rules and regulations that channel innovation and economic growth in a direction that protects citizens’ data, taking the nation instead in “a new direction” (the title of its Paper) — one that exploits citizens’ data. Should the ICO have held back on fines on Cambridge Analytica on the grounds that it was promoting innovation and growth?

This overt political interference in the enforcement of law is confirmed by [Paragraph 319] which introduces “a power for the Secretary of State for DCMS to prepare a statement of strategic priorities to inform how the ICO sets its own regulatory priorities”.

The nature of the consultation process

At 146 pages and over 50,000 words of dense, technical (and often obfuscatory) commentary, the consultation seems designed not to be read: its true intent lies in carefully chosen wrecking amendments buried in a welter of often irrelevant detail. How many people have had the time or energy to read and inwardly digest the full document to grasp its implications?

One of the stated justifications for the proposals is that current regulations are “unnecessarily complex or vague” and continue to “cause persistent uncertainty”. Yet the consultation itself is both unnecessarily complex and vague.

Its use of the word ‘responsible’ is a good example. The consultation highlights the difficulty of defining terms like ‘the public interest’ and has an extended discussion of the meaning of ‘fair processing’, concluding that “Comprehensively defining ‘outcome fairness’ in the context of AI within the data protection regime may [therefore] not be a feasible or effective approach”. But with the word ‘responsible’ it introduces a term new to data protection regulations, using it 52 times … without ever defining it.

Rationale for reform

The Government’s main justification for these ‘reforms’ (other than to rectify ‘confusion and vagueness’) is to address ‘consent fatigue’ and “unleash data’s power across the economy and society for the benefit of British citizens and British businesses” thereby “ushering in a new golden age of growth and innovation right across the UK”.

Neither of these justifications stands up to scrutiny.

‘Consent fatigue’ is mainly caused by the widespread gaming of consent systems, compounded by lax regulator oversight. The problem can be better addressed without any changes to the law, as we show here.

The ‘innovation and growth’ envisioned by the Government in this Consultation represents a deep misunderstanding of what makes data valuable; misconceptions about where the biggest opportunities for innovation lie and how to enable them; and a fundamental misunderstanding of the nature and potential of artificial intelligence.

In short, because it is intellectually incoherent and flawed, it will not achieve its stated goal: to “unleash data’s power across the economy and society for the benefit of British citizens and British businesses”. In fact, it is almost certain to do the exact opposite.


Addendum: Paragraphs 57 and 58 in full

57. The government therefore proposes to create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test in order to give them more confidence to process personal data without unnecessary recourse to consent. The processing would still have to be necessary for the stated purposes and proportionate. For those activities not on the list, the balancing test would still be applied. The balancing test could also be maintained for use of children’s data, irrespective of whether the data was being processed in connection with an activity on the list. The government is mindful that Article 6(1)(f) of the UK GDPR recognises that particular care should be taken when data controllers are relying on the legitimate interests lawful ground to process data relating to children.

58. Any list would also need to be sufficiently generic to withstand the test of time, although the government envisages it could be updated via a regulation-making power. In that respect, the list would be similar to the approach in Section 8 of the Data Protection Act 2018 for the public tasks processing condition. For example, it could cover processing activities which are necessary for:

  (a) Reporting of criminal acts or safeguarding concerns to appropriate authorities
  (b) Delivering statutory public communications and public health and safety messages by non-public bodies
  (c) Monitoring, detecting or correcting bias in relation to developing AI systems (see section 1.5 for further details)
  (d) Using audience measurement cookies or similar technologies to improve web pages that are frequently visited by service users
  (e) Improving or reviewing an organisation’s system or network security
  (f) Improving the safety of a product or service that the organisation provides or delivers
  (g) De-identifying personal data through pseudonymisation or anonymisation to improve data security
  (h) Using personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers
  (i) Managing or maintaining a database to ensure that records of individuals are accurate and up to date, and to avoid unnecessary duplication