How do you get a good security standard? Really, not like this:

Alec Muffett
Dec 9, 2018

1. Challenge

Do me a favour: take out a credit card or debit card and look at it — front and back — because it will help with the discussion to come.

Consider what you know about such “payment cards”. You yourself probably know quite a lot about how you should treat the data that is printed or stamped upon such a payment card:

  • There’s the big number on the front (the “PAN”) and if someone is ever talking to you (or emailing you) about your credit card, they should probably only ever quote the last 4 digits at you. The PAN is typically embossed on the card for long life, and so that it can still be used with stone-age carbon-paper receipts. You may also know that the first 6 digits tend to refer to the issuing bank.
  • There’s the Expiry date, which we know is a bit sensitive but still embossed, because it’s used to validate the card and is part of the transaction information.
  • There’s the Cardholder Name
  • On the magnetic stripe will be a Service Code and PIN-related information
  • There may or may not be: start dates, issue numbers, sort codes, account numbers, bank logos, holograms, chip contact pads, etc.; the fact that these elements vary, or may not exist at all, tells us that they are optional…
  • …and then, on the reverse, is the CVV code*, which is very important, very secret (not embossed), and which you must never tell to anyone with whom you are not transacting; there are very, very strict rules regarding who is permitted to store that information on disk, as opposed to carefully passing it along to the bank in question. (These handling rules are sketched in code just after this list.)
  • *You get a bonus point if you know that it’s not correctly called the “CVV” but instead one of several other acronyms beginning with “C”; deduct one point if you simply call it “the number on the back”.
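
To make concrete quite how prescriptive these handling rules are, here is a minimal Python sketch of the two best-known of them: truncating the PAN for display, and refusing ever to store the CVV. The names (mask_pan, persist, FORBIDDEN_TO_STORE) and the policy details are my own invention for illustration, not text lifted from PCI-DSS itself:

```python
# Illustrative sketch only, not real compliance code; all names here
# are invented for this example.

def mask_pan(pan: str) -> str:
    """Truncate a PAN for display.

    PCI-DSS permits displaying at most the first 6 and last 4 digits;
    this sketch shows only the last 4, as on a typical receipt.
    """
    digits = pan.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

# Elements that must never be written to storage after authorisation.
FORBIDDEN_TO_STORE = {"cvv", "pin", "full_magstripe_track"}

def persist(record: dict) -> dict:
    """Refuse to store forbidden elements; truncate the PAN before writing."""
    leaked = FORBIDDEN_TO_STORE & set(record)
    if leaked:
        raise ValueError(f"must never be stored on disk: {sorted(leaked)}")
    return {**record, "pan": mask_pan(record["pan"])}

print(mask_pan("4000 1234 5678 9010"))  # prints ************9010
```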

You know all of this partly from familiarity, but also because of the nature of the payment card: it is well-defined, all of the different elements have formal names, and clever people who have studied payment security for decades have performed “risk analyses” of those elements and have published, at extraordinary length, rules regarding what the website of a merchant (for instance: Amazon, Tesco, Google, PornHub) may or may not do with the different elements of data:

A heavily marked-up excerpt of the overview of the necessarily anal-retentive PCI-DSS standard

All of these rules are defined in the Payment Card Industry Data Security Standard, ongoing development of which is a long-term process drawing upon the insights and intentions of a huge multi-stakeholder community, from card issuers and banks to merchants large and small, through website engineers, developers and integrators, to (of course) the users — the people who pay and who expect their money and data to neither leak nor be lost.

2. However…

In my experience[1] there has never been a decent security standard created in vacuo; it simply does not suffice to gather a bunch of “experts” and have them birth some documents which will address the protection needs and use cases of some broad topic.

Note that this is quite different to (say) creating a corporate security policy, where the only stakeholder is one’s own company — but even then it is highly desirable to seek third-party perspectives regarding what issues and eventualities you have not considered.

However: from rumour I gather that the above is precisely how the BBFC, the British Board of Film Classification, are approaching the four-fold matters of:

  • data management and protection (e.g.: from rogue employees)
  • data storage, protection & deletion (e.g.: on harddisk or in-cloud)
  • host security (e.g.: configuring & hacker-proofing your servers)
  • network security (e.g.: preventing your data being accessed unwarrantedly, over the internet)

Rumour has it that the BBFC are throwing these issues over the wall to “a large consultancy” which will produce some well-thought-out documentation that will purportedly address all of the risks in the above four (or more?) security spaces.

3. “Well, that’s no ordinary rabbit!”

It wouldn’t be so bad if we had not already been here before; “Age Verification” (AV) was envisioned as a mechanism to protect some percentage of British children[2] from “stumbling across” pornography online.

The AV opportunity was seized (as ever) by the usual suspects seeking to monetise yet-another Government-mandated requirement for digital identity[3] — abetted this time by porn-industry giants who will profit from competitive analytics (e.g. “which indie-porn sites are getting most traffic, shall we buy or compete with them?”) and other forms of competitive / anticompetitive industry leverage.

In order to address “security”, the aforementioned gathered together under the auspices of the British Standards Institution and drafted “PAS1296” — a “Publicly Available Specification”, available to anyone willing to pay £90, yet somehow the whole of PCI-DSS is free to download — and (long story short) it was not remotely fit for purpose.

I publicly excoriated an early draft of PAS1296 in a long Twitter thread, and I stand by those criticisms today: many of the issues raised then remain unaddressed, and in fact more existential risks have since come to mine and the Open Rights Group’s attention:

Since the initial arguments over PAS1296, the UK Government has passed the Data Protection Act (DPA), which designates as sensitive the “…processing of data concerning an individual’s sex life or sexual orientation” — §86(7)(e) — and yet apparently nobody wishes to consider that if a person regularly age-verifies in order to access “ireallylikegayporn.com”, the resulting metadata trail will clearly constitute “data concerning [their] sex life or sexual orientation”.
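
To illustrate how little it takes for this to happen, here is a minimal Python sketch (every field name is my own invention; the first site is the hypothetical example from above) showing that even a stripped-down age-verification log, holding no names and no payment details, still amounts to exactly such a metadata trail:

```python
# Hypothetical sketch: the field names and records are invented for
# illustration. Even an "anonymised" AV log groups trivially into
# per-user browsing profiles.
av_log = [
    {"user_id": "u-1234", "site": "ireallylikegayporn.com", "ts": "2018-12-01T22:14Z"},
    {"user_id": "u-1234", "site": "ireallylikegayporn.com", "ts": "2018-12-03T23:02Z"},
    {"user_id": "u-5678", "site": "someotherpornsite.example", "ts": "2018-12-02T21:30Z"},
]

# No names, no card numbers -- yet grouping by user yields a profile:
profiles = {}
for row in av_log:
    profiles.setdefault(row["user_id"], []).append(row["site"])

# profiles["u-1234"] is now, plainly, "data concerning [their] sex life
# or sexual orientation" in the sense of DPA 2018 s.86(7)(e).
print(profiles["u-1234"])
```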

This issue is written, verbatim and clearly, into the Data Protection Act; and if — as industry and regulator — we are ignoring observations as obvious as this, we must surely be in some kind of rush…

4. “Brave Sir Robin ran away, he bravely ran away away…”

https://publications.parliament.uk/pa/jt201719/jtselect/jtstatin/240/240.pdf p.5

Apparently “Age Verification” — which barely existed until the Digital Economy Act shoehorned it into existence — is a “fast moving” industry; in fact it’s moving so fast that it is ignoring key questions like the above, in pursuit of launching on a tight schedule.

Let’s follow a line of logic:

  • The BBFC say that they are going to certify AV providers and systems, and they agree with DCMS that they will launch this by April 2019
  • AV providers would have to be ready to proceed to launch by ~ March
  • Which leaves (at best) January/February to codify a security standard.
  • Which is so rushed as to preclude meaningful, open & public consultation.
  • Which (per section 2) will inevitably result in an unsatisfactory security “standard” that is not fit for any purpose other than to insulate AV providers from valid criticism or legal challenge…
  • …risking (per section 3) everyone’s data in order to satisfy a “tickbox” security requirement, ignoring fundamental security issues (up to and including “compliance with the Data Protection Act”) in the process.
  • aside: the standard will have to regulate all of AV providers, AV systems, and (presumably) website operators, just to add to the complexity of what is meant to be produced in this time.

All of the regulatory focus (Joint Committee on Statutory Instruments, Thirty-Ninth Report of Session 2017–19, pages 12–16) is upon the question of ensuring that age-verification of some adequate quality is performed, and/or upon various sanctions against websites which fail to age-verify.[4]

But as regards concrete data protection, what do we find? From the Government, in print, we find this:

https://publications.parliament.uk/pa/jt201719/jtselect/jtstatin/240/240.pdf p.14

…a series of platitudes that implicitly punt the responsibility away from the BBFC and over towards the ICO / Information Commissioner’s Office, with a nod towards making sure that the AV providers must be made aware that Data Protection is a serious business.

This is reinforced in the BBFC’s own, recent, guidance (October, page 3):

This guidance also includes the role and function of the Information Commissioner’s Office (ICO), the UK’s independent body set up to uphold information rights, and the requirements that age-verification solutions and online pornography providers must adhere to under data protection legislation, which is enforced by the ICO. As set out in 3.6 of the Secretary of State’s Guidance to the Regulator, the role of the BBFC is to focus on the ability of arrangements to verify whether someone is 18 or over. The BBFC will not duplicate the role of the ICO, and there is a memorandum of understanding establishing a framework for co-operation and information sharing.

I have no adequate words to describe the abrogation of responsibility in this stance; it reflects a consummate lack of regard, on the part of the Government and their appointed regulator, towards the very data which they have demanded must be collected.

5. “We are the Knights Who Say… Ni!”

In October the BBFC themselves produced “Guidance on Age-verification Arrangements” which contains this:

Outside of the statutory regime for age-verification and in order to encourage good practice, the BBFC is developing a voluntary, non-statutory certification scheme for age-verification solutions in consultation with the ICO. This scheme will incorporate a third party assessment of the data security standards within any age-verification solution which seeks certification under the scheme. Only those age-verification solutions which pass the scheme’s standards, as audited by the third party, will receive certification by the Age-verification Regulator. An outline of the certification scheme appears in Annex 5.

The fact that there is an AV-provider certification scheme at all suggests that the BBFC have twigged that the technical aspects of data protection need to be taken seriously.

SPECULATION: I am not certain that I am interpreting them correctly; however, “a voluntary, non-statutory scheme to be established by the regulator under which age-verification providers may be independently audited by a third party and then certified by the regulator” sounds to my ears awfully like “perhaps this would go faster if we outsource the problem?”

It certainly would appeal to companies who seek to profit from AV — and therefore to their regulator, who would thus be satisfying their mission — to kick all of this “security” stuff into the long grass and instead submit themselves to private audit by some big consultancy (PwKYDTG) which signs off their architecture with “looks good to us!”, takes the money, and runs away.

Even if not outsourced, the tragedy of this ersatz approach to “compliance” would be that we are no longer in a world of consistent and transparent data protection — again the Payment Card Industry stands as a yardstick: the PCI-DSS standards documents are free for even the general public to review; they are regularly refined; they define a clear taxonomy of the data which is collected; and they define how that data may be used, how it must be stored, and how it should be disposed of. There is a clearly-described bar over which merchants and payment providers must jump, as adjudicated by someone known as a Qualified Security Assessor (QSA) — a trained, examined and certified job role whose work is itself audited.

Yet with Age Verification data security mechanisms as-proposed:

  • we have no taxonomy (a sketch of what one might look like follows this list)
  • lacking a taxonomy, we have no categorisation of the sensitivity of different elements of the AV metadata trail, and therefore no regulation of how the different elements must be handled
  • beyond the general “for the defined purpose” constraints of GDPR, we have no restrictions upon what information may be collected
  • beyond the general “for the defined purpose” constraints of GDPR, we have no restrictions upon how the metadata may be used, reused, or resold: e.g. resellable analytics regarding website popularity, cookie tracking, “custom audience”-like lists, etc.
  • all of which links back to “…data concerning an individual’s sex life or sexual orientation”, the processing of which constitutes “sensitive processing” under the Data Protection Act.
  • We have no process for iterating implementation, learning, and remediation of the standard.
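
For concreteness, here is a hypothetical Python sketch of the sort of data taxonomy that a serious standard would start from. Every field name, sensitivity category and handling rule below is invented by me, which is precisely the point: nothing like it has been published for AV.

```python
# Hypothetical sketch of a data taxonomy for age verification.
# None of this exists in any published AV standard; it is the kind
# of artefact a PCI-DSS-style process would be obliged to produce.
AV_DATA_TAXONOMY = {
    "proof_of_age_document": {        # e.g. a passport scan used for the check
        "sensitivity": "high",
        "storage": "delete once verification completes",
        "reuse_or_resale": "forbidden",
    },
    "verification_result": {          # the single bit: over-18 or not
        "sensitivity": "low",
        "storage": "session only",
        "reuse_or_resale": "forbidden",
    },
    "site_requesting_verification": { # reveals browsing habits
        "sensitivity": "special category",  # DPA 2018 s.86(7)(e) territory
        "storage": "forbidden",
        "reuse_or_resale": "forbidden",
    },
}
```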

With such an ad-hoc “review by a hopefully-clever person” scheme, we would be taking sensitive data regarding the porn-browsing habits of a nation of tabloid-readers, and “protecting” it by having a poor, script-following college intern (via a large consultancy?) ask generic questions of the AV provider’s geeks, for which service the consultancy will be paid irrespective of outcome.[5]

Further: all of this AV “information protection” is being authored and decided by people who are divorced from the risk, who have no “skin in the game” of data protection other than to enable the “fast-moving” AV business. By comparison, the PCI-DSS standards were written by the organisations (banks, card companies) which literally bore the losses of fraud; but the people who lose out from a leak of AV data concerning sex life or sexual orientation are the members of the general public — from call-centre workers to Members of Parliament — potentially en masse and at scales of millions of people.[6]

One need only look to the Commons CMS committee to see Parliamentarians who are extremely exercised regarding the technical details of protecting vast amounts of sensitive data concerning millions of Britons. Where, in the development of Age Verification, is the equal introspection regarding the technical details of protecting age-verification data, delving into the proportionality and protection measures of each datum stored?[6]

6. “All right, sonny. That’s enough. Just pack that in.”

This is a long essay, so I’ll just try to summarise:

“Age Verification” has enough problems in that it is founded upon a misapprehension of how the internet actually works in practice; laying that aside, the planned (forced?) implementation timeline is:

  • unrealistic
  • harmful to, and blithe regarding, data protection, and
  • being worked on, in camera, by vested interests pursuing solution architectures which will make it hard to eventually undo the mistakes that they are inevitably making now.

A public consultation, or at least transparent and open engagement with a broad selection of stakeholders and interested parties, resulting in measurable public standards, would go a long way towards inoculating us against leaks of data concerning the sex life or sexual orientation of members of the British public.

I’ll finish with a highlighted extract from the BBFC’s recent guidance; notice that there are no definitions of terms such as:

  • “protected”
  • “securely”
  • “necessary”
  • “appropriate” / “appropriately”
  • [show they have] “considered”

…and also that §4.3c is phrased as “the need to process the minimum personal data necessary” rather than “the need to process no more than the minimum personal data necessary” (as drafted it mandates a floor where a ceiling was intended), which suggests sloppy drafting (slightly saved by back-referencing GDPR in §4.5).

This kind of thing has been my job for three decades: this is not a measurable standard, it is neither complete nor adequate, and it is not fit for purpose. Not to mention that the whole regime is “voluntary”; might it simply be ignored completely?

7. Footnotes

[1] Alec Muffett has worked in the field of information security for 30 years: he implemented the first modern password cracker; built firewalls and protected systems and networks at Sun Microsystems; was Principal Engineer & Chief Architect for Security in EMEA for both Sun Microsystems’ Professional Services & Financial Services divisions; and designed and built security systems for investment banks, telcos, ISPs, stock-clearance houses and grid/cloud computing datacentres. He worked for Surevine as head of security and put that company through ISO27001; worked for Facebook as a software engineer, where he built various resources including the Facebook Tor/Onion site and was architect & lead engineer for adding end-to-end encryption to Facebook Messenger. He is a member of the board of directors of the Open Rights Group, a member of the Security & Privacy Executive of the British Computer Society, a Principal Engineer at Deliveroo, and frequently contributes to open-source security projects as a hobby. He is also on Twitter as @alecmuffett; he speaks entirely for himself in this matter. More information, with bio/headshot, is available at his website.

[2] I have been told that the authors of Age Verification hoped and intended to set up a “global” standard; this does not bode well for the authors’ understanding of how the internet is organised.

[3] See also: “digital identity cards”, “digital government identifiers”, “combatting online benefits cheats”, etc…

[4] Including, it is proposed by the Government:

  • blocking social media accounts of non-compliant porn sites
  • forcing search engines to hide links to non-compliant porn sites
  • punishing/blocking ISPs of non-compliant porn sites
  • blocking advertising by/of/upon non-compliant porn sites

…per this document, page 15:

https://publications.parliament.uk/pa/jt201719/jtselect/jtstatin/240/240.pdf

…although what actual sanctions are enforceable is still to be decided.

[5] This has been typical of my experience in similar compliance spaces.

[6] Added for clarity, 2018–12–10
