Why we need a ‘PCI’ for Porn Metadata

Alec Muffett
Aug 5, 2018


“Classy…”

Retrospective…

Growing up as a teenager in the 1980s, the hardcore “Porn” of that era was fairly easy to obtain: the upper-sixth-formers (17+) who looked old enough would buy a variety of embarrassing and decontextualised magazines with implausible names like “Razzle”, and these would then be viewed, shared around, and bartered down to the lower-sixth students in exchange for favours, the practical subtext of most of which was “avoiding being bullied by being ‘friends’ with the older kids”.

This is not to say that the fifth-formers (Year 11, age 15–16) were entirely without other forms of access to porn-in-general; this was the era when all tabloid newspapers were legally obliged to print Samantha Fox’s breasts on page 3 or 7 at least once per week; not to mention that the aforementioned glossy magazines would eventually — half-wrecked — be found by students aged 10 and upwards, either behind the bikesheds or else blown into hedgerows and under wheeliebins where the “porn fairy” had left them.

And the porn industry itself must have been quite respectable at the time, surely? Not least because some of the people working in it went on to become political advisers.

The Present Day…

The above is life as I remember it from that era; yet the children of today are apparently at greater risk of “stumbling across pornography” than at any time in the past. My personal opinion is that if the children of today are still anything like the children of the 1980s, they aren’t stumbling so much as attempting to bodily throw themselves into this arcane pool of adulthood, and it would be better for them to be properly advised by teachers in how to swim, rather than left to their own devices to drown in ignorance of real relationships & consent.

However: the Government (and various pressure groups) disagree; it doesn’t matter if you are Tory or Labour, there is always something specific about pornography to object to, and therefore by extension a reason to seek to restrict all of it; it doesn’t matter if you are an independent genderqueer feminist entrepreneur who produces boutique spanking porn, you are clearly either (Tory) bent upon corrupting youth, or (Labour) oppressing and exploiting yourself because of patriarchy, and (Both) your ongoing livelihood is less important than stopping some hypothetical children from “stumbling” across your site.

Which brings us to the Digital Economy Bill (qv)

https://medium.com/@alecmuffett/on-the-digital-economy-bill-1df356862ac2

How it worked until this year…

The baseline of using any commercial website on the internet is straightforward: register for an account, and it doesn’t much matter if you choose a username like KinkajousAreSexy or XenaWarierPrinces, so long as the payment details are (a) valid, (b) legitimate, and (c) securely handled, then nobody really cares who you “are”, nor what your “identity” is, nor whether you have proven yourself to be mature- (or at least old-) enough to purchase something.

To a first approximation, most websites deal with requirements a/b/c by “outsourcing”, because payment cards and the Payment Card Industry (PCI) are extremely serious about doing security properly; to this end there is a Data Security Standard (PCI-DSS) which is industry-mandated for all organisations which handle stuff like credit cards.

Sidebar: Pros/Cons of Outsourced PCI-DSS-compliant Providers

Vendor Downside: you must pay a per-transaction fee, but you probably have to do that anyway for credit-card processing.

Vendor Upside: you don’t really have to know anything about the user named “XenaWarierPrinces”, plus there are additional anti-fraud and other benefits offered by the payment provider, so you have a safety net.

Customer Upside: there are lots and lots of online vendors, but relatively few payment processors who are capable of jumping through the PCI-DSS security hoops; therefore a) the payment experiences are fairly consistent, b) the DSS requirement means that your data is held securely, and c) because payments are outsourced to competent, DSS-compliant third-party payment providers, your personal & payment details are reasonably safe if the vendor website gets hacked.

The PCI-DSS is a VERY thick document (which is absolutely free to anyone who wants to read/download it) that defines the practical and technical aspects of protecting your credit card information and metadata — the big number on the front, the little number on the back, the address where you live, and all the details of your transactions.

To make a long story much shorter: being compliant with all of these protections is long and complex, and requires regular auditing by expensive people, so most websites outsource the compliance, protection, and handling of payment card data to specialists like Stripe or WorldPay.

How it’s supposed to work, now.

However: children “stumble across porn” and the Government wants this to be stopped; the result is perverse in a way that has nothing to do with porn.

In order to view pornographic content you must soon “prove” to somebody that you are old enough to be permitted to do so; this ends the effective pseudonymity of “XenaWarierPrinces” and instead requires that she validate herself as “Mrs Jane Trellis, of North Wales, age 47” so that the “vendor” porn website — which hitherto may not have had any concept of “logging in” — will eventually be certain that no British child is ever going to stumble across their content.

The Government’s approach was to throw the detailed implementational problems of “Age Verification” to industry to be sorted out; the Government sought to foster[1] a new service industry of “age-verification providers” — trusted authorities to whom you could prove your identity once and then view as much porn as you like while that identity tracks your horny perusal of the internet’s erotic bounty.

And all of this would be “safe” because of two documents — known respectively as PAS1296 and GDPR — which were supposed to provide everything necessary for this nascent age verification industry to be secure.

Except that PAS1296 was technically deficient, and GDPR — in its British incarnation as the Data Protection Act — says something rather more important about this topic…

What the GDPR has to say about Sexuality…

Let’s go look at the Data Protection Act Part 3, sections 35 & 42 — and note that the precise phrasing is:

“sensitive processing” means … the processing of data concerning an individual’s sex life or sexual orientation.

With regard to data that requires “sensitive processing”, the Data Protection Act does not limit itself to metadata that merely describes someone’s sex life or orientation — names of partners, activities, labels like straight, gay, trans — but in fact the act captures any and all information that simply “concerns” an individual’s sex life or orientation.

I cannot imagine any viable argument that an individual’s porn-browsing habits are somehow not “data concerning [that] individual’s sex life” — even in the circumstance that the person’s sex life might be a purely solitary pursuit.

Porn browsing, individually or as a joint endeavour, clearly is part of someone’s sex life; and is therefore deserving of “sensitive processing” under the DPA or GDPR.

Let us therefore examine the burden of “sensitive processing”; s42 essentially reinforces s35(4–5), and s42(2) says:

The controller [ie: AGE VERIFICATION PROVIDER] has an appropriate policy document in place in relation to the sensitive processing if the controller has produced a document which: (a) explains the controller’s procedures for securing compliance with the data protection principles (see section 34(1)) in connection with sensitive processing in reliance on the consent of the data subject or (as the case may be) in reliance on the condition in question, and (b) explains the controller’s policies as regards the retention and erasure of personal data processed in reliance on the consent of the data subject or (as the case may be) in reliance on the condition in question, giving an indication of how long such personal data is likely to be retained.

Now: the PAS1296 document was drafted by industry players to address the “how to” questions of age verification; it largely deals with the rationale of age verification, and offers a little perspective about how to keep vendors (ie: porn sites) separate from the user data, but it says nothing concrete about what age verification providers need to do in order to adequately protect the information that they hold.

One could suggest that PAS1296 is the “document in place in relation to the sensitive processing” — except it’s clearly not; for instance, none of the following words are in PAS1296: erase, erasure, destroy, delete; the word “deletion” is only referred to in respect of credential management, and “deleting” appears only in a questionnaire on page 45:

What attribute lifecycle management processes for maintaining, creating, updating and deleting attributes are in place? — p.45

Perhaps the point of PAS1296 was to drive creation of such an “appropriate policy document”? Possibly, but information security should not be treated as a “choose your own adventure” book — and relegating “sensitive processing” to a “did you remember to do this?” questionnaire is emphatically not tenable.

By comparison? PCI-DSS covers all these issues (and more!) at extraordinary length and depth, yet processing payments data is NOT EVEN “sensitive processing” under the terms of the Data Protection Act!

Let’s be as clear as possible on the differences:

  • the payments industry understood the data-protection landmine that they were sitting on top of, and they therefore took reasonable and proactive precautions in developing PCI-DSS for protection of payment card data.
  • the authors of the age-verification requirements in the Digital Economy Act appear focused upon political outcomes of child protection, and have sought to create an unregulated “Age Verification” industry whilst deliberately ignoring the issues of technical data protection for which (not least) the Data Protection Act demands “sensitive processing”.

The Government has:

  • walked into the process of people browsing porn,
  • in order to protect children, demanded that the consumers in future identify themselves with strong identity checks,
  • sought to foster “age verification providers” to keep the data which supports those strong identity checks away from the porn sites,
  • completely ignored the matters of regulating “age verification providers”, their habits, ownership, business behaviours, and the technical data protections and controls which they employ to keep their data secure

Ergo, if age-verification “transaction data” is to be stored — and it will need to be stored, not least to validate age-verification billing — we will need a PCI-DSS equivalent (AVDSS?) to define the technical data protections and standards for the AV data lifecycle — creation, storage, deletion, hosting, use and resale of data, operational protection, frequency of audit & compliance, liability/measures for failure to comply, etc.
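To make the lifecycle point concrete, the “creation, storage, deletion” requirement could be expressed in code as records carrying their own retention deadline, with scheduled erasure enforced mechanically. This is a purely illustrative sketch: the record fields, the `purge` helper, and the 90-day retention period are my assumptions, not anything mandated by an actual standard.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Assumed retention period; a real AVDSS would mandate and audit this value.
RETENTION = timedelta(days=90)

@dataclass
class AVTransaction:
    token: str             # opaque reference: never the subject's real identity
    site: str              # the relying porn site, kept only for billing reconciliation
    verified_at: datetime  # when the age check was performed

    def expired(self, now: datetime) -> bool:
        """A record is past its retention deadline once RETENTION has elapsed."""
        return now >= self.verified_at + RETENTION

def purge(records: list[AVTransaction], now: datetime) -> list[AVTransaction]:
    """Scheduled erasure: drop every record past its retention deadline."""
    return [r for r in records if not r.expired(now)]

records = [
    AVTransaction("tok-1", "site-a", datetime(2018, 1, 1)),   # long expired
    AVTransaction("tok-2", "site-b", datetime(2018, 7, 20)),  # still within retention
]
kept = purge(records, datetime(2018, 8, 5))
print([r.token for r in kept])  # ['tok-2']
```

The design choice worth noting is that the deadline travels with the record, so “how long such personal data is likely to be retained” (per s42(2)(b)) is answerable by inspection rather than by trusting an unwritten policy.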

This is not only required in order to protect the data adequately — after all, the age verification provider will have a list of each and every porn site you will have accessed[2] — but also to provide administrative “pressure” to assure data minimisation amongst all parties, including the porn sites; see the ICO principles:

https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/principles/data-minimisation/

…and consider that online vendors are only permitted to retain tiny amounts of credit-card data (perhaps the last 4 digits, or even merely the card expiry date); meanwhile the “tokens” which link them back to the payment provider are not reused between sites, are not linkable to each other, and are frequently rotated.
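Those two properties — truncation of the card number, and per-site unlinkable tokens — can be sketched in a few lines. What follows is illustrative only: the HMAC-based derivation, the function names, and the rotation “epoch” are my assumptions about how such a scheme could work, not a description of any real payment processor.

```python
import hashlib
import hmac
import secrets

def mask_pan(pan: str) -> str:
    """Retain only the last four digits of a card number (PAN) --
    roughly the most a typical vendor is permitted to store."""
    return "*" * (len(pan) - 4) + pan[-4:]

def site_token(provider_secret: bytes, card_ref: str, site_id: str, epoch: int) -> str:
    """Derive a per-site, per-epoch token. Because the site identifier and a
    rotation epoch are mixed into the keyed hash, tokens issued to different
    sites (or in different rotation periods) cannot be linked to one another
    without the provider's secret key."""
    msg = f"{card_ref}|{site_id}|{epoch}".encode()
    return hmac.new(provider_secret, msg, hashlib.sha256).hexdigest()

secret = secrets.token_bytes(32)  # held only by the payment provider
t_a = site_token(secret, "cardref-123", "shop-a.example", epoch=2018)
t_b = site_token(secret, "cardref-123", "shop-b.example", epoch=2018)
assert t_a != t_b  # same card, two sites: the tokens do not match

print(mask_pan("4111111111111111"))  # ************1111
```

Rotating the epoch gives the same unlinkability over time: even a single site cannot correlate a customer’s token from one period with their token from the next, which is precisely the property an age-verification token would also need.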

There is a lot of work to be done in this space, and it is not work that can be deferred until something bad happens.

Summary

  • If you visit a porn / adult-content website, the fact that you visited that website is data concerning your sexuality and therefore it requires “sensitive processing” under the DPA/GDPR
  • The current plans for upcoming “age verification providers” are not commensurate with the requirements for “sensitive processing”
  • The payment-card industry, in PCI-DSS, offers a strong and open model for defining the technical data protection measures (incl: ongoing audits) to adequately protect such sensitive information
  • “Personal information leaks” are not like “money leaks” — they cannot be underwritten nor reversed[3]; so it is foolhardy to launch an industry of “age verification providers” without properly addressing these risks.

Footnotes

  • [1] Historically the UK Government has attempted repeatedly to bootstrap “identity as a service”, mostly to solve their own challenges, but perhaps also as a sop to the community who wish for a “digital identity card”.
  • [2] not only for billing but to assure ongoing proofs of age
  • [3] for instance: https://en.wikipedia.org/wiki/Mosley_v_News_Group_Newspapers_Ltd
