Response to draft Guidance on Age-Verification Arrangements and draft Guidance on Ancillary Service Providers


(1) My name is Alec Muffett. I am submitting this response in an entirely personal capacity; however, for context:

- I am a recognised authority on internet security, and have worked in and around the computer & network security industry for about 30 years.

- In previous employment, I was Chief Architect for Security for Sun Microsystems Professional Services in EMEA, designing systems for deployment in investment banks, telcos, internet service providers, stock-exchange clearance houses, etc.

- In previous employment I was a software engineer for Facebook’s “Security Infrastructure” team, leading several major projects.

- I am a member of the board of directors of the Open Rights Group.

- I am a member of the newly-established “Security & Privacy Executive” of the British Computer Society.

(2) I find many causes for concern in the drafts regarding age-verification arrangements and ancillary service providers; however, I shall limit this response to a single, most pressing issue.

(3) I am deeply concerned by the lack of regulatory oversight, and by the lack of standards regarding the operational and functional aspects of data and information security, both of which bear upon this consultation via:

“It also includes information about the requirements that age-verification services and online pornography providers must adhere to under data protection legislation and the role and functions of the Information Commissioner’s Office (ICO).”

(4) I aver that these two deficiencies risk irreparable damage to the privacy of millions of Britons. Taking them in reverse order:

SECTION: The lack of standards regarding the operational and functional aspects of data and information security

(5) Data pertaining to “an individual’s sex life or sexual orientation” is clearly of a highly sensitive nature, evidenced by the fact that it is a category of data which is (several times) called out for special treatment in the upcoming Data Protection Act 2018.

(6) Further, we are aware from “Mosley v News Group Newspapers Ltd” that matters of privacy related to sexuality are hard, perhaps impossible, to adequately redress.

(7) Moreover, we are aware that a prurient market for such information exists: witness again the Mosley case, as well as the history and behaviour of British tabloid journalism, not to mention matters of criminality, blackmail, and extortion.

(8) However, the only standard which speaks to the protection of age-verification data is “BSI PAS 1296”, a general-purpose document that can be characterised as “how to age-verify” (with related concerns) for all businesses, from online penknife sales to hardcore pornography.

(9) It should be obvious that there is a difference in the sensitivity of data between “John Doe purchased a Swiss Army Knife from Amazon”, versus “Jane Doe visited”

(10) The fact that “Jane” visited “” suggests that Jane may be a lesbian, and is thus information that pertains to “an individual’s sex life or sexual orientation”.

(11) Again, per “Mosley”, if this information leaks, then redress is hard and expensive; published information cannot easily be “put back in the bottle”. This distinguishes such information from (say) credit-card data, where, in the case of “identity theft”, the banks (etc.) underwrite losses and will recompense the victim.

(12) Yet the entire architecture of “age verification” is to create large, centralised, attractive-to-hack repositories of personal information recording that “Jane Doe” sought age-verification for “”; and these repositories are expected to “sort out” their own “homebrew” operational and functional data-protection standards under “laissez-faire” regulation.

(13) Thus, to summarise this section:

- regulation is creating a few, large, centralised, attractive-to-hack repositories

- of personal information which pertains to “an individual’s sex life or sexual orientation”

- yet we are proposing that such data is adequately “protected” by a general-purpose document that describes how to perform age verification for purposes as varied as online penknife sales

- and which in turn punts data protection requirements to GDPR / the upcoming Data Protection Act,

- in the misconceived expectation that GDPR and the Data Protection Act provide operational and functional standards

- when in fact they provide only yet more regulatory requirements

- leading, inevitably, to diverse “homebrew” security implementations,

- breach of which will lead to bulk leakage of sensitive data which is hard to redress, per Mosley.

(14) This does not appear to offer proportionate protection for this character of data, especially at the scale of millions of Britons held in a handful of weakly-regulated, “homebrew”-secured databases; we are thereby setting the stage for another “Ashley Madison”-style data breach, a breach which led to several suicides because of the nature and sensitivity of the information leaked.

SECTION: The lack of regulatory oversight

(15) Above, I mention credit-card payments for comparison; payment-card information is not called out for special processing under the upcoming Data Protection Act, and instead is merely expected to be treated “normally”.

(16) Further: data breaches involving payment-card information are redressable; compensation can be paid, and well-established mechanisms and processes exist to support recompense, even where one of the entities involved goes bankrupt.

(17) So: payment-card information is of a considerably less “existential” nature than pornography-site age-verification data; however, it is protected by a considerably better operational and functional standard, adherence to which is mandatory for any operator or vendor/commercial customer of payment-card services: the “Payment Card Industry Data Security Standard”, or PCI-DSS.

(18) PCI-DSS is a comprehensive suite that defines concrete requirements for protective technologies (firewalls, encryption), access controls (passwords, more encryption), sensitivity levels (which aspects of card data are most secret, as opposed to visible “on-screen”), operations, and screening of personnel (background checks).

(19) PCI-DSS also defines what portion of payment-card data (if any) is visible to the vendor who is selling to a customer.
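By way of illustration only (the function below is my own sketch, not code from any standard): PCI-DSS Requirement 3.3 caps what may ever be displayed of a card number at, at most, the first six and last four digits — an example of the concrete, per-element rules that the age-verification regime entirely lacks:

```python
def mask_pan(pan: str) -> str:
    """Mask a primary account number (PAN) for display.

    PCI-DSS Requirement 3.3 permits at most the first six and last
    four digits of a PAN to be displayed; all other digits must be
    masked. This is an illustrative sketch, not certified code.
    """
    digits = pan.replace(" ", "").replace("-", "")
    if not digits.isdigit() or len(digits) < 13:
        raise ValueError("not a plausible PAN")
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

print(mask_pan("4111 1111 1111 1111"))  # 411111******1111
```

The point is not the code itself but that PCI-DSS specifies exactly which portion of the sensitive datum may be seen, by whom, and in what form — a level of operational precision absent from PAS 1296.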

(20) “BSI PAS 1296” covers none of this; again, its primary focus is upon the process of age-checking (e.g. assuring that the customer cannot bypass an age check) rather than upon protecting the _fact_ of age-checking.

(21) Further: there is no mention of performing criminal-records (CRB) checks on staff, nor of checking whether one’s new employee might previously have worked at some Sunday tabloid.

(22) There is no definition of “adequacy” for protection of different aspects of age-verification data (viz: “Jane Doe”, her address, or which of several websites she has age-verified with).
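No such per-field classification exists today; the sketch below is purely hypothetical — the field names and sensitivity tiers are my own invention, drawn by analogy with PCI-DSS’s per-element rules for card data — but it indicates the kind of “adequacy” definition a proper standard would have to supply:

```python
# Hypothetical sensitivity tiers for age-verification data.
# No such classification exists in PAS 1296 or elsewhere; all
# names and tiers here are illustrative inventions.
SENSITIVITY = {
    "subject_name":        "restricted",    # e.g. "Jane Doe"
    "subject_address":     "restricted",
    "date_of_birth":       "restricted",
    "verification_result": "confidential",  # pass/fail or age band only
    "sites_verified_for":  "never_store",   # the most damaging datum:
                                            # which sites Jane verified with
}

def may_retain(field: str) -> bool:
    """A real standard would have to state which fields may be
    retained at all; unknown fields default to the strictest tier."""
    return SENSITIVITY.get(field, "never_store") != "never_store"

print(may_retain("sites_verified_for"))  # False
print(may_retain("verification_result"))  # True
```

Note the asymmetry: the record linking an identified person to the specific sites she verified with is categorically more dangerous than her name or address alone, and a standard would need to treat it accordingly.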

(23) In short: PAS 1296 is wholly insufficient for the purpose of defining protection of “sensitive age-verification data”.

(24) There are many steps that would be necessary to address this tremendous gap:

- a regulator will need to define operational and functional security standards (akin to PCI-DSS) for provision of “sensitive age-verification data services”,

- it will need to be able to regularly audit and shut down non-compliant providers of (and, per PCI-DSS, larger customers of) sensitive age-verification data services.

- liabilities will need to be assigned, and the sensitivity of differing classes of information will need to be defined.

- a means of redress/compensation would need to be defined for people who have had data leaked in the instance that an age verification provider is bankrupted by GDPR fines, etc, subsequent to a breach.

- other…

(25) I would appreciate the BBFC addressing this matter and raising with the Government that existing regulations are insufficient, and the associated standards not yet fit for purpose, to permit deployment of age verification to move forward.