On the Digital Economy Bill

Alec Muffett
Oct 28, 2016

Evidence as submitted to the Digital Economy Bill Committee. Since that time, at least one of the proposed technical solutions — the social-media-based one — has reduced what it requests from users for the purposes of authentication; however, amongst all the other methods and their concerns, the overall proposition of “social media” age verification still has many issues, including but not limited to:

  1. compliance with the platform policies of (e.g.) Facebook and other social-media sites
  2. user-profiling through their friend-relationships
  3. adequate protection of these new AV databases against hackers
  4. unclear restrictions upon future / 3rd-party use that may be made of data that was collected for AV purposes

Evidence follows…

Summary

(1) For proposed “Age Verification” (AV) techniques, this document provides observations and recommendations regarding:

  • AV techniques that are backed by payment card checks
  • AV techniques that are backed by social media information
  • AV techniques which build databases of porn usage
  • Operational security for protection of AV data
  • The correctness of AV determination
  • The impact of AV upon pornography SMEs
  • The responsibilities of the regulator, and the need to regulate “AV Providers”
  • Digital Economy Bill Age Verification provisions as a whole
  • ANNEX: proposed implementations, false-positive rates & PAYG devices

(2) …including heretofore apparently undiscussed risks, and many matters which have not yet been addressed in the deliberations. Much of this evidence was first published as a post on my blog at https://medium.com/@alecmuffett — and I have been widely encouraged to rework and improve it (with great haste) to submit as evidence, so I apologise to the committee for its length and for any oddity caused by this rapid rework.

Author

(3) I am an independent software engineer and network engineer, specialising in security, privacy, integrity and cryptography for more than 25 years.

(4) In this time I have worked directly or indirectly for corporate network security teams, hardware & software vendors, built systems for ISPs, telcos, retail & investment banks, worked for government & defence “cybersecurity” systems integrators, and most recently have worked for 3 years as a software engineer for a leading social network.

(5) For several years I have also been a member of the Board of Directors of the Open Rights Group, a UK digital-rights campaigning and activist organisation, acting primarily in a “technical advisor” capacity.

Content

(6) My submission to the committee focuses on the Digital Economy Bill and its proposals for Age Verification (AV), and the implementation and regulation thereof.

(7) Recently, with friends and colleagues from Open Rights Group, I spent several hours at the Adult Provider Network’s “Age Verification Demonstration” to learn about technologies that claim to fulfil the draft bill’s proposed AV requirements.

(8) To date there appears to have been no critical appraisal of proposed AV technologies. I have a strongly technology-based perspective and wish to outline my impressions of:

  1. The “implementability” of AV
  2. The technical “themes” of the various proposed techniques for AV
  3. A look at each of, and potential issues with, the ten AV techniques as-presented at the demo; this detail is provided in the annex to this submission

The “implementability” of AV

(9) There are perhaps five major challenges for AV implementation with respect to its goals:

(10) Ranum’s Law: named after security expert Marcus Ranum, the eponymous law states that “you cannot solve social problems with software” and requires one to consider frankly whether available technological techniques are fit for one’s goals, and if not whether one should redraw one’s goals.

(11) Accuracy of Age Determinations: There is a general principle in computing that “Garbage In = Garbage Out”. Now consider that all of the age verification techniques actually leverage some kind of substitute or “proxy” that “proves” the age of the porn-viewer. Where the proxy (“age of phone-user”, “habits of Facebook-user”, …) comprises malleable or bogus data, the subsequent AV decision will also be bogus. A key matter that is left unaddressed in the Digital Economy Bill is: who carries liability for both false-positives and false-negatives in this circumstance?

(12) Operational Security Risk, and “Attractive Nuisance”, of Databases: A high price would be paid by tabloid newspapers for a list enumerating the porn-preferences of the Manchester United first eleven. Creation of AV databases — especially “rich” databases with links between network addresses, phone numbers and social media accounts for corroboration — makes an attractive nuisance that will (like the World Anti-Doping Agency database, recently) draw the attention of hackers from around the world. Small AV-providers without oversight are unlikely to invest in the security resources to mitigate such a challenge, let alone pay for employee background-checks to mitigate “insider” risks, etc.

(13) Expansion: stated overtly by the organiser of the Age Verification demonstration, and repeated amongst the “industry” attendees, was a general belief that AV is something that is “clearly going to happen”, that there is “no stopping it”, and that (eventually) “a single payment card transaction might not be considered sufficient for regulation in the future”, and that extra authentication would be necessary. I submit this to be evidence of a ratchet mentality: if AV becomes established, it is already expected by its constituency to grow to become more complex and burdensome, even with — perhaps precipitated by — the many technical issues that this submission describes.

(14) Finally, completeness: all of the techniques I observed are circumventable by any motivated teenager. There was frank recognition of this at the AV Demo, that “we won’t really be able to / we aren’t trying to stop the (16+) late-teens” — in which case one must (via Ranum) compare the goal with the terms in which the enabling legislation is phrased.

(15) It would be entirely possible to implement AV without regard to its stated goal of “blocking access to porn for Britons under the age of 18”. If our goal is merely to implement AV, then any or all of the techniques may be implemented, but:

  1. all of the techniques are circumventable
  2. multiplying or combining them will leave them still circumventable, whilst reducing usability and practicality still further.
  3. at least one of the techniques may have significant collateral impact upon systems which defend us against payment-card fraud
  4. at least one of the techniques operates in direct contravention of the policies of a major source of information that it uses
  5. at least two of these techniques involve the creation of large and sensitive databases — estimates of 25m+ people — which may in future be repurposed for monetisation, e.g.: advertiser web-tracking, data mining, etc.
  6. one of these techniques seeks to leverage any or all of the other techniques; if they are all unfit for purpose, so is it

(16) If our goal however is to restrict children from accessing pornography then none of these techniques will provide complete control; thus in future there will be calls for either or both of:

  1. Massively expensive filtering and blocking at the network level, eventually culminating in implementation of a “British Great Firewall” directly equivalent to the Government-mandated censorship controls which are deployed in China; and…
  2. Stronger bindings of “identity” and “accountability” to internet usage, eventually culminating in some form of “British Digital Identity Card” / “Internet Drivers License”, with all the cost and debate that has surrounded physical “identity cards”, but with even greater complexity, and perhaps impossibility, of meaningful achievement.

(17) …so it is wise for us now to consider whether we wish to pursue either or both of these illiberal, unworkable but otherwise inevitable paths, or choose to address our goals differently.

(18) I have heard the incompleteness of Age Verification defended as “we should not let ‘perfect’ be the enemy of good” — yes, but neither should we permit ‘wishful thinking’ to ignore reality, and if there are better (and more economic) outcomes to be had through education and engagement than via ever-increasing proscription under a flawed framework, we should consider carefully what we want to achieve — a better society, or deployed and expensive enforcement technologies with unforeseen consequences?

Themes of AV “techniques”

(19) The techniques presented at the AV Demo fall broadly into five categories, being mostly distinguished by the source material for the AV check:

Payment cards

(20) I was live-blogging my observations during the AV demo, and when I wrote that one of the proposals was a “Payment Card Age Check”, an anonymous friend of mine who is a former Qualified Security Assessor (QSA) — i.e. a person who has been certified by the Payment Card Industry Security Standards Council to audit merchants for security standards compliance — responded very critically, and at length. See the original blog for transcript.

(21) His sentiments resonated with those that I have heard expressed by many other security people working in Payment Card Industry (PCI) security over the past five years. Simply: the payment card industry does not perceive itself as a provider of AV: it does not profit from it, it does not seek to be authoritative regarding it, it will not accept liability for it, and it is not geared up to provide that information operationally.

(22) The issues that he raised are:

(23) Expense & Speed: PCI networks are already burdened with payment transactions; to add AV transactions, with no revenue generated, will reduce performance for no gain.

(24) Information Proliferation: every individual has data about them which should not be shared more broadly than is absolutely necessary; for instance, it is now understood that “Mother’s Maiden Name” is no longer suitable as an authenticator, because that information is too easily obtainable. A Government-mandated initiative which teaches people that it is “okay” to type their payment-card numbers into random sites on the internet — in order to see even “free” porn — equates to a Government-mandated boom in identity theft and fraudulent transactions, especially given the shoddy security of too many porn websites.

(25) Incompatibility with “Stand-In Processing” (STIP): STIP is a regular bookkeeping process where, perhaps two nights per month, payment card transaction handling performed by a bank is taken offline and transactions are handled instead by the card association (e.g. Visa, MasterCard), “standing in” for the bank like a baby-sitter.

(26) During STIP the card association — lacking access to the cardholder’s personal data — is apt to say “yes” to any question which is asked of it, presumably including any transaction that would lead to assuming that the user is over 18 years old. This is an edge case, perhaps mitigable by banning the types of payment card which might be issued to a 16+ minor, or by assuming that the chance of a minor exploiting an adult’s card during STIP is fairly low; but however you cut it, this example helps demonstrate that payment systems were never intended or designed for identity or age verification.

(27) Harm to payment-card fraud prevention measures: Payment-card AV checks are implemented by placing a “hold” on a small (or even zero) sum of money — say £1 — and then refunding/releasing it. This check tests the current validity of the card and therefore implicitly tests the age of its owner, where cards are only issued to people aged 18+.
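To make that mechanism concrete, below is a minimal sketch of such a “hold and release” age check, assuming a wholly hypothetical payment-gateway client (the gateway class, its methods, and the £1 amount are illustrative assumptions, not any real payment API):

```python
# Illustrative sketch only: "HypotheticalGateway" and its methods are invented
# for this example and do not correspond to any real payment API.
from decimal import Decimal

class HypotheticalGateway:
    """Stand-in for a real payment-card gateway client."""

    def authorise(self, card_number: str, expiry: str, amount: Decimal) -> str:
        """Place a hold for `amount`; return an authorisation reference."""
        raise NotImplementedError

    def void(self, auth_ref: str) -> None:
        """Release the hold without ever capturing the money."""
        raise NotImplementedError

def card_based_age_check(gateway: HypotheticalGateway,
                         card_number: str, expiry: str) -> bool:
    """Return True if the card authorises a token hold.

    Note the chain of proxies: "card is currently valid" stands in for
    "cardholder is 18+", which in turn stands in for "viewer is 18+".
    """
    try:
        auth_ref = gateway.authorise(card_number, expiry, Decimal("1.00"))
    except Exception:
        return False          # declined: treated as "not proven to be 18+"
    gateway.void(auth_ref)    # refund/release the £1 hold immediately
    return True               # a valid card is treated as "18+"
```

Note that this flow is, step for step, the same low-value authorise-then-release pattern that anti-fraud systems already treat as a signal of card-testing by criminals — which is the crux of the concerns below.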

(28) Given the burst of transactions that AV will generate, we can expect the use of card checks to implement AV to risk the following impacts upon payment-card anti-fraud systems:

  1. The anti-fraud systems worry about low-value (e.g.: £1) transactions because they are a technique used by fraudsters to validate stolen card credentials. Dilution or swamping of these checks in order to support AV at nationwide “porn” scale will have negative consequences upon the fight against card fraud.
  2. Reciprocally: any service which sets-up to provide specifically AV-focused card-check services will be used by fraudsters to validate stolen credentials.
  3. The risks associated with low- or zero-value transactions are currently mitigated by most banks having hard rules about the number of transactions permitted per day. If you watch enough porn you may not be able to buy your groceries on the same day.

(29) Also, as ever, many kinds of payment-card check may be trivially, obviously and secretly bypassed by teenagers who “borrow” a parent’s card without incurring cost.

Social Media

(30) As a former Facebook software engineer, I was horrified at a proposed AV technique based upon “social media profiling”. The technique requests the following information from your Facebook account:

public profile, friend list, email address, timeline posts, relationships, birthday, work history, education history, events, hometown, current city, photos, likes, tagged places

(31) …which the AV-provider copies and performs “machine learning” against, in order to establish whether they believe you are 18+ — apparently seeking incongruities in your profile information. They also actively seek to link this information with your Twitter, LinkedIn and Paypal (and other?) accounts, in order to build a fuller profile.

(32) This is a request for a copy of almost every significant piece of content that is associated with your Facebook account, including a graph of who your friends are, the photos that you take, what you “like”, where you live, and enough information to link your identities to other social network “silos” without necessitating your permission. The makers presumably seek similar information from other social networks.
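For a sense of scale, the request above corresponds roughly to a Facebook Login dialog asking for the following permissions; the mapping of the quoted list onto Graph API permission names, the app ID and the redirect URI below are my own illustrative assumptions:

```python
# Rough illustration of the breadth of such a login request. The permission
# names are my own mapping of the quoted list onto Facebook Graph API login
# permissions; the app ID and redirect URI are placeholders.
from urllib.parse import urlencode

REQUESTED_PERMISSIONS = [
    "public_profile", "user_friends", "email", "user_posts",
    "user_relationships", "user_birthday", "user_work_history",
    "user_education_history", "user_events", "user_hometown",
    "user_location", "user_photos", "user_likes", "user_tagged_places",
]

def login_dialog_url(app_id: str = "EXAMPLE_APP_ID",
                     redirect_uri: str = "https://av-provider.example/callback") -> str:
    """Build the OAuth dialog URL asking the user to grant every permission above."""
    query = urlencode({
        "client_id": app_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": ",".join(REQUESTED_PERMISSIONS),
    })
    return "https://www.facebook.com/dialog/oauth?" + query

print(login_dialog_url())
```

A single “yes” on a dialog of that breadth grants the AV provider a copy of essentially the whole account, which is what makes the proportionality question below so stark.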

(33) All of this data is to be stored under uncertain controls, with unclear access restrictions, unclear user-ownership, user-review and content-deletion policies, and none of these matters (nor associated liabilities) have been addressed in the draft bill, nor in the industry’s attempt at formalising a draft operational standard, BSI PAS 1296.

(34) I consider social-network-mining AV techniques to be particularly:

  1. Misconceived (“unwise from the broader perspective of security”)
  2. Gratuitously invasive (“give us all your data”)
  3. Disproportionate (“give us all your data… to prove that you are 18+”)
  4. Risky (being aggregated where? by whom? who can see it? how does one control it? does the data get used by, shared with, or sold to third or fourth parties?)

(35) To this I will also add that I consider the technique to be:

(36) Dubiously effective: a spot-test with the proposed solution, using a fake-but-plausible Facebook account, led to an approval for porn access.

(37) Against Facebook’s official “Platform Policy”, per the following:

Don’t use data obtained from Facebook to make decisions about eligibility, including whether to approve or reject an application or how much interest to charge on a loan. https://developers.facebook.com/policy

(38) As with the Payment Card technique above, I have a general sense of this technique having been developed in a vacuum, without the buy-in of the information sources upon which it relies. This is not just to build an AV technique on sand, but to build one on quicksand, because it is entirely possible that the payment-card networks or the social networks may some day shrug off these roles which Government-mandated AV providers are attempting to thrust upon them.

Mobile Phone Accounts

(39) On the back of the previous two examples, it should suffice for me to summarise this technique as “if you have possession of a mobile phone and can use it, then you must be 18+, because we infer from the telco-database that the phone-user is 18+”.

(40) This inference obviously rests on age data being accurate — another possibly unwarranted assumption — plus the leap from “physical phone possession” to “age of phone user” may be incorrect.

(41) There are also significant challenges in getting a Pay-As-You-Go device marked as being owned by someone who is 18+, increasing the likelihood of false-positive AV blocks. Pornography sites will lose sales on account of incorrect blocks due to PAYG, and (from a social perspective) improper PAYG blocking will also disproportionately impact those on lower incomes, for whom PAYG is often the most suitable option. See the Annex for a link.

(42) Also — if this technique is adopted — then additional methods of AV will need to be available in order to address the needs of tourists, visitors, and/or foreign networks which geolocate to the UK. There will never be a single, “one-stop” AV technique, leading to a “race to the bottom” where the most easily used (and easily bypassed?) means of AV, wins. Again, consider Ranum’s Law.

“New Technology”

(43) With opportunity comes innovation, and one proposal presented struck me as both interesting and empowering — providing the user with some degree of control over their data — but was highly risky in terms of the upfront demands for Government ID which would then be shared with outsourced fourth parties for “verification” services, raising yet more “proliferation” issues.

“Existing Intermediaries”

(44) The fifth theme is most simply described as “let’s leverage whatever databases or tools we already have control over, turn that into an AV service, and then monetise it”. These techniques are delightfully simple — almost primitive — from a technical standpoint, but with that simplicity come enormous potential political and social ramifications, depending upon who the service provider is.

Common Themes

(45) Before final recommendations, I would like to step back from AV and make a few general observations on the bill, and on proposed AV techniques:

Scale, Risk & Cybersecurity Mis-education

(46) Regarding “Scale”: the representative from MindGeek said that they expect their technique to be serving AV for 25 million people within one month of deployment; this constitutes 39% of the British population, and is comparable to the number of people in the UK who have a Facebook account (from public sources, estimated 30 million).

(47) Yet this “Age-Verification Social Network” — and its associated databases and great cost — would not actually exist but for the AV provisions of part 3 of the draft bill.

(48) Also: from a “cyber” perspective it is a very unwise idea to habituate 39% of the British populace into improper security patterns such as:

  • The trade of social media data for access
  • The provision of photos of Government ID, to random apps/websites, for access
  • Typing your phone number into random websites
  • Typing your payment card details into random websites
  • …especially typing them into hackable, poorly-implemented websites which may have been coerced (via “cross-site scripting”) to ask the user for their secret “CVV” number from the back of their card, siphoning that data off to enable fraud

Regulation and the Regulator

(49) The Bill makes no provision for approval and banning of particular AV systems, thus no guarantee can be made about the suitability and security of AV systems that will emerge.

(50) As the Bill stands, none of the risks detailed in this document can be assessed, prevented or mitigated by the AV Regulator. Instead, some of the worst consequences might (at best) be punished via data protection law, after matters have gone seriously wrong, after the porn-preferences and porn-habits of notable people — celebrities, sportsmen & women, and politicians — and less famous people, have all been published on (say) Wikileaks.

(51) It is of even greater concern that the embryonic AV industry is flying headlong into deployment with only a single, vague and incomplete draft standard (BSI PAS 1296) to guide it — e.g. the PAS 1296 draft mentions data protection only in terms of “data processing within the European Union” without covering operational security, nor security implementation details such as those defined by (e.g.) the Payment Card Industry Data Security Standards (PCI-DSS) for the storage of sensitive information.

Recommendations

Regarding AV techniques that are backed by payment card checks

(52) Before instituting the use of payment cards as a mechanism for AV, the committee should discuss at length with senior security and operations representatives of the Payment Card Industry (PCI) whether they consider AV to be a proper use of payment cards and PCI systems.

Regarding AV techniques that are backed by social media information

(53) I appeal to the committee, and to my former colleague Baroness Shields — who should be likewise familiar with the risks posed to vulnerable people from unnecessary proliferation of their Facebook profile information — to look again at the impact of not only social-media-based, but also of all proposed AV implementations.

(54) Also, for Baroness Shields to please consider from her perspective as Parliamentary Under Secretary of State for Internet Safety and Security whether what is being wrought in the name of age verification and child protection might not actually be a net-negative for Privacy, Internet Safety and Security, both for children and other people.

Regarding AV techniques which build databases of porn usage

(55) The committee should institute a ban upon indirect monetisation of, or derivative use of, all data collected or inferred, directly or indirectly, for the purposes of AV; this will also beneficially prevent “inference” from being shared as “fact”.

Regarding protection of AV data

(56) The committee should require the adoption of controls similar in spirit to the Payment Card Industry Data Security Standards (PCI-DSS) for storage of all information that is used by any party for the purposes of AV. This would include, but not be limited to, the requirement that data must not be stored once an authorisation is received: once an identity is AV-proofed, the data collected to prove that age must not be retained by (e.g.) the porn site.
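A minimal sketch of the spirit of that requirement follows; the record layout and the check_evidence() step are assumptions invented for illustration, not anything proposed in the bill or in PAS 1296. The point is simply that, after a successful check, only the bare result survives — never the evidence used to reach it:

```python
# Illustrative sketch only: the record layout and check_evidence() are
# assumptions for this example, not part of any proposed standard.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AgeVerificationResult:
    """The only record retained after a successful check."""
    subject_token: str        # opaque reference to the user, not their identity
    verified_over_18: bool
    checked_at: datetime

def check_evidence(evidence: dict) -> bool:
    """Placeholder for whichever AV technique is actually in use."""
    raise NotImplementedError

def verify_and_discard(subject_token: str, evidence: dict) -> AgeVerificationResult:
    """Run the AV check, then throw the evidence away.

    `evidence` might be card details, ID scans or social-media data; in the
    spirit of PCI-DSS-style retention rules, none of it is written to storage.
    """
    over_18 = check_evidence(evidence)   # the actual AV decision happens here
    evidence.clear()                     # the evidence itself is discarded
    return AgeVerificationResult(
        subject_token=subject_token,
        verified_over_18=over_18,
        checked_at=datetime.now(timezone.utc),
    )
```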

(57) The committee should clarify what penalties the Information Commissioner and the Regulator will (especially post-Brexit when the matter will become unclear) impose upon AV providers who leak sensitive data, including but not limited to:

  1. The fact that a person is aged 18+
  2. Raw “Personally Identifiable Information”
  3. Information mined “about” the user
  4. Which sites requested AV for which people
  5. Which people sought AV from which sites

(58) …especially items 4 & 5, which would constitute mass leaks, or could lead to porn- and gender-preference “profiling”; compare with the recent “Ashley Madison” leak.

Regarding the correctness of AV determination

(59) The committee should clarify the burden of liability for incorrect AV, including:

  • punishment for false negatives (AV said 18+ when the person was not), and…
  • compensation for false positives (AV said under-18 when the person was not)

(60) The committee should clarify whether it will be a criminal or civil offence to lie to an AV provider, and how this would interact with erroneous AV determinations made by AV providers themselves, inferring upon the basis of incorrect or fraudulent data.

Regarding the impact of AV upon pornography SMEs

(61) Regarding the following text from the bill:

23 Exercise of functions by the age-verification regulator

(1) The age-verification regulator may, if it thinks fit, choose to exercise its powers under sections 20 and 22 principally in relation to persons who, in the age-verification regulator’s opinion —

(a) make pornographic material or prohibited material available on the internet on a commercial basis to a large number of persons, or a large number of persons under the age of 18, in the United Kingdom; or

(b) generate a large amount of turnover by doing so.

(62) I note that independent SME porn producers risk being pushed out of their enterprise — and possibly their livelihood — due to the cost of services and regulation to which the bill will require them to adhere.

(63) Section (a) is redundant because any document placed upon the Internet is inherently “available to…a large number of persons” — several billion at last count — and (b) already covers the “commercial basis” aspect; therefore I recommend that the draft bill:

  • Either have section (a) excised in its entirety,
  • Or — if the text’s intent is to capture into regulation only “large” porn sites:
    • that the committee seek to clarify the text, and…
    • replace the trailing “or” from section (a) with “and”, in order to relieve the regulatory AV burden upon pornographic SMEs.

Regarding the responsibilities of the Regulator

(64) The Bill lacks any provisions that allow the Regulator to choose whether one AV system or another is, or is not, fit for use.

(65) To achieve standards of security, and avoid the problems I outline above, the Bill must create duties for the AV regulator which require them to:

  1. Regulate the AV providers
  2. Halt business operations by AV providers who are not regulated
  3. Approve or disapprove providers’ AV techniques and implementations thereof
  4. Halt business operations by AV providers whose techniques and implementations are not approved
  5. Enforce AV-provider standards based upon publicly-agreed criteria including:

(66) Criteria:

  1. Storage of data under recognised standards of both security operations (eg: ISO27001) and protection of sensitive data (eg: PCI DSS)
  2. Regular audits of compliance with those security standards
  3. Maximising the privacy of people who are the subjects of AV
  4. Maintaining the security and implementation of data sources used for AV
  5. Maintaining non-linkability of AV checks between (porn) sites that consume AV, so that “naughty-gardening.com” and “nude-cookery.com” cannot conspire by means of AV data to link the identity of someone accessing both websites (see the sketch after this list)
  6. Proof that data collected for purposes of AV is not used or reused for other purposes
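To make “non-linkability” concrete, here is one minimal sketch; the key handling and token format are illustrative assumptions, not a proposed standard. The AV provider hands each consuming site a per-site pseudonym derived from a keyed hash, so that tokens seen by two different sites cannot be matched against one another without the provider’s secret key:

```python
# Illustrative sketch of per-site pseudonymous AV tokens; the key management
# and token format are assumptions for this example only.
import hashlib
import hmac

PROVIDER_SECRET_KEY = b"replace-with-a-real-well-protected-key"

def per_site_token(user_id: str, consuming_site: str) -> str:
    """Derive a site-specific pseudonym for an AV-checked user.

    Because the derivation is keyed and includes the site name, the token
    issued to one site cannot be correlated with the token issued to another.
    """
    message = f"{consuming_site}:{user_id}".encode("utf-8")
    return hmac.new(PROVIDER_SECRET_KEY, message, hashlib.sha256).hexdigest()

# The two sites receive unrelated-looking identifiers for the same person:
print(per_site_token("user-12345", "naughty-gardening.com"))
print(per_site_token("user-12345", "nude-cookery.com"))
```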

(67) The regulator must provide a “portal” to all regulated AV providers, permitting people direct access to see, review and contest AV-related data which is held about them; compare this with the availability of “credit reports” to their subjects.

Regarding Digital Economy Bill Age Verification provisions as a whole

(68) Even if all my recommendations above are adopted, with such breadth of matters still up for consideration of implementation and consequence, I would recommend the excision of all Age Verification provisions from the existing draft Digital Economy Bill, to be carefully reconsidered and submitted instead under a wholly separate bill.

ANNEX

(1) Review of Proposed AV techniques

(2) Gaming Industry Techniques

Method:

  • This demonstration was a digression into how the Gaming industry has many potential solutions to address this space.

Issues:

  • Regulated Gaming deals squarely with exchanges of money on the basis of risk, so in addition to age-regulation there are anti-fraud and other measures which demand a strong concept of identity between vendor and client.
  • However: Porn on the Internet is not purely a paid-for commercial matter; there are also all of:
    • free-to-consume (paid via sidebar advertising)
    • free-to-consume (as a side-effect of a commercial livelihood, eg: authors)
    • free-expression (as in “porn for art’s sake”)
    • political activism / awareness-raising
    • other rationales, too numerous to describe
  • As such, AV solutions incorrectly predicated on a “porn is like gambling” model — and indeed the mandated requirement of AV for access to porn in general — will have chilling effects upon these forms of expression, owing to their assumption of profit motives.

(3) The SMS Technique

Method:

  • Visit porn website
  • Enter phone number
  • Receive challenge SMS and respond with “VERIFY”
  • Access Porn

Issues:

  • Fake porn-sites abound
  • Trains people to share their phone numbers with random websites
  • Fake websites can sell phone numbers onward for marketing or extortion
  • SMS is no longer considered a secure channel; it is open to forgery, interception, etc.

(4) The Social Media Technique

Method:

  • Described at length in the main text

Issues:

  • Described at length in the main text
  • Also: similar to mobile, raises an accessibility issue for foreign cards, tourists, etc.

(5) The Specialist App Technique

Method: (as described)

  • Install an app
  • Use app to take ‘selfie’ to prove that you are human
  • Use app to take picture of Government ID
  • (Selfies and ID are sent to third and fourth parties for verification and validation)
  • You now have an ‘identity’ and may visit porn sites
  • Visit porn website, be shown QR-code-like “challenge”
  • Scan code using app; app tells user how much data the site wants
  • User confirms; app validates to website.
  • Access Porn

Issues:

  • Complex-ish setup & use, demands use of a smartphone
  • Who receives and holds copies of the Government ID?
  • What on-phone data (including location) may the app access?
  • “SAVVIcode” (Millican/Stajano, 2014) — QR-code may be subject to substitution

(6) The Outsource Credit Reference Technique

Method:

  • User gives information to AV-provider, permitting the latter to perform a credit check
  • Third party informs the website that user is 18+

Issues:

  • This is essentially glorified password authentication; teens with knowledge can pretend to be their parents, or even trade in stolen credentials
  • Will the AV-provider cache the information? How safely? Anonymised how?

(7) The Payment Card Technique

  • Described at length in the main text

(8) Federated Login

Method:

  • MindGeek are offering something quite clever; rather than create a technology of their own they promise to use “anything that works” from the list of others, and will bundle up results in an easy-to-use interface which porn sites may buy as a service.

Issues:

  • This solution provides a godlike perspective upon flows of a vast quantity of porn-related traffic — who logs into which sites first, what other sites they go to, which sites are popular (and perhaps which they might like to acquire?) — as well as potentially bounce rates, referral rates, and other analytics.
  • They apparently have no plans to launch an ad-tracking network à la DoubleClick, but with their expectation of “25 million UK users in the first month” it would be hard to justify not monetising the potential a few years down the line.
  • A hypothetical outage of this service — DDoS, or DNS censorship — could inhibit porn-access by almost the entirety of the UK, making it an interesting single point of failure in any scenario.
  • However: they also need at least one acceptable AV technique that will work.

(9) The Credit Reference Agency Technique

Method:

  • Identical to the “Outsource Credit Reference” above, but performed by Experian.

Issues:

  • As per “The Outsource Credit Reference Technique” but now your credit reference agency knows your porn preferences.

(10) The Payment Card Possession Technique

Method:

  • Use a special App which can securely confirm that you possess a kind of payment-card which only people aged 18+ may use

Issues:

  • Risks mostly as-per the “Specialist App Technique”, above
  • Also, if this is a passive check, teens may “borrow” a parent’s card

(11) The E-Wallet Technique

Method:

  • Create a new digital currency
  • Get people to buy “coins” up front using age-verified payment
  • Use coins to watch porn.

Issues:

  • Not all porn is paid for, as in “Gaming”, above
  • What prevents teenagers from spending coins bought by an adult?
  • Yet another digital currency

(12) Endnotes

(13) The Importance of False-Positive Rates

  • Any AV test that always returns an “Under 18” result will be 100% accurate at identifying people who should not be allowed to see porn.
  • Similarly, any AV test that always returns “Age 18+” will be 100% accurate at identifying people who are allowed to see porn.
  • Error rates are important; using MindGeek’s statistic of 25 million people, an AV test which misclassifies 0.1% of adults as “Under 18” will block 25,000 voters from their informed choice to access porn (see the worked example below).
  • The misclassification rate will likely be much larger than 0.1%.
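A minimal worked version of that arithmetic, with the 0.1% figure being illustrative as above:

```python
# Worked example of the false-positive arithmetic above; the error rate is
# purely illustrative.
adult_users = 25_000_000        # MindGeek's own first-month estimate
false_positive_rate = 0.001     # 0.1% of adults misclassified as "Under 18"

wrongly_blocked = adult_users * false_positive_rate
print(f"{wrongly_blocked:,.0f} adults wrongly blocked")   # 25,000
```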

(14) Challenges of AV for Mobile Pay-As-You-Go (PAYG) Devices

  • I have previously documented the complexity of having PAYG devices cleared for adult access, at this URL: https://goo.gl/70uxxb
  • To remove adult-content blocks can require an in-person visit to a phone shop, discriminating against those on low incomes for whom PAYG is preferable / sometimes the only option.
  • This impacts all AV techniques which leverage telco user databases (SMS, and possibly also apps)
