A Sequence of Spankingly Bad Ideas
An encounter with those who seek to gain from the Digital Economy Bill’s mandatory “Age Verification” requirements
[ update: 28 October 2016 — this article amended and submitted to parliament ]
“A journalist, a human-rights activist, a lawyer, a software engineer, and a feminist pornographer walk into a bar…”
Last Thursday, with friends and colleagues from Open Rights Group, I spent a few hours at the Adult Provider Network’s Age Verification Demonstration (“the demo”) to watch demonstrations of technologies which attempt to fulfil Age Verification requirements for access to online porn in the UK.
Specifically: Age Verification (“AV”) is a requirement of part 3 of the Digital Economy Bill that seeks to “prevent access by persons under the age of 18” to “pornographic material available on the internet on a commercial basis”.
There are many contentious social and business issues related to AV, for instance:
- that alternate-sexuality-themed content may no longer be available to people below the age of 18 who could benefit from it constructively.
- the risk of independent “SME” porn producers being pushed out of their enterprise — and possibly their livelihood — due to the overbearing cost of services and regulation which the Government will require them to adhere to. Details of this and many related issues are included in Pandora Blake’s blog on this topic.
- the phrase “commercial basis” is currently left to interpretation, so it is unclear whether (e.g.) the blog of an author of “erotic e-books” with sidebar advertising for his/her own work, would be considered “commercial” and thus require regulation, age-gating, and payment of fees to both the regulatory authority and the age-check services to which they must subscribe.
So there are many open questions and many criticisms of the Digital Economy Bill’s provisions; but to date there appears to have been no critical appraisal of the proposed technologies for AV, and so that is what I seek to address in this posting.
I am a freelance software engineer and network engineer, specialising in security, privacy, integrity and cryptography for more than 25 years.
In this time I have worked directly or indirectly for corporate network security teams, hardware & software vendors, built systems for ISPs, telcos, retail & investment banks, worked for government & defence “cybersecurity” systems integrators, and most recently have worked for 3 years as a software engineer for a leading social network.
For several years I have also been a member of the Board of Directors of the Open Rights Group, a UK digital-rights campaigning and activist organisation.
I have a deeply internet-technology-based perspective of the challenge of AV, and in this posting I would like to outline my impressions of:
- The “implementability” of AV
- The themes of the various proposed technical solutions to AV
- A deep-dive, looking at each of the AV solutions as-presented
These are personal perspectives and — especially when I get down to the “deep-dive” — are on the basis of notes that I made at speed, so there may be some minor errors but I hope to give a reasonably fair, but not uncritical, presentation. If errors are pointed out to me, I shall amend and annotate this posting.
The “implementability” of AV
There are perhaps five major challenges for AV implementation with respect to its intention:
- Ranum’s Law: named after Marcus Ranum, the eponymous law states that “you cannot solve social problems with software” and requires one to consider frankly whether available technological solutions are fit for one’s goals, and if not whether one should redraw one’s goals.
- GIGO: There is a general principle in computing that “Garbage In, Garbage Out” (GIGO) — and now consider that all of the age-verification mechanisms actually leverage some kind of substitute (or “proxy”) that “proves” the age of the porn-viewer. Where the proxy (age of phone-user, age of Facebook-user, …) is malleable or bogus data, the subsequent AV decision will also be bogus. (Aside: a question apparently left unanswered in the draft Digital Economy Bill is: who carries liability in this circumstance? There is a lot of discussion of the means of regulation in Part 3 of the bill, but apparently nothing regarding the actual act of “Age Verification”?)
- OPSEC: A high price would be paid by tabloid newspapers for a list enumerating the porn-preferences of the Manchester United first eleven. Creation of AV databases — especially “rich” databases with links between IP addresses, phone numbers and social media accounts for corroboration — makes an “attractive nuisance” which will draw the attention of hackers around the world. Small AV-providers are unlikely to have the operational security (OPSEC) resources to face such a challenge, let alone “insider” risks.
- Expansion: stated overtly by the demo organiser, and amongst the “industry” attendees, there was a general sense that AV is something which is “clearly going to happen”, that “there is no stopping it”, and that (eventually) “a single credit card transaction might not be considered sufficient for regulation in the future”, and that extra authentication would be necessary. I submit this to be evidence of a ratchet mentality, that if AV becomes established it is already expected by its constituency to grow to become more complex and burdensome, even with — perhaps precipitated by — the many technical issues that this posting describes.
- Finally, completeness: all of these mechanisms are circumventable by a motivated teenager. There was a general recognition of this at the AV Demo, that “we won’t really be able / we aren’t trying to stop the late-teens” — in which case one must (via Ranum) compare the goals with the terms in which the enabling legislation is phrased.
It would be entirely possible to implement AV without regard towards its actual goals; this would be as famously lampooned in “Yes, Prime Minister”:
We must do something!
[This] is something!
Therefore we must do it!
If our goal is to implement AV then any or all of the solutions may be implemented; however:
- all of the mechanisms are circumventable
- multiplying or combining them will leave them still circumventable, whilst reducing usability and practicality still further.
- at least one of these mechanisms may have significant collateral impact upon mechanisms which defend us against fraud
- at least one of these mechanisms operates in direct contravention of the policies of major source of information that it utilises
- at least two of these mechanisms involve the creation of — presumably huge — databases which may be repurposed in future for monetisation, e.g.: advertising web-tracking, data mining, etc.
- one of these mechanisms seeks to leverage any or all of the other mechanisms; if they are unfit for purpose, so is it
If our goal however is to restrict children from accessing pornography, then none of these mechanisms will provide complete control, leading either to the abandonment of AV, or to the aforementioned “ratchet” being applied (“More, Harder, Stronger Assurance!”) likely eventually becoming the basis for a UK “Digital Identity Card”.
If AV proceeds regardless, I would recommend at minimum the following safeguards:
- A ban upon indirect monetisation or derivative use of all data collected or inferred, in either case directly or indirectly, for purposes of AV; this will also prevent “inference” from being shared as “fact”.
- Adoption of PCI DSS Standards for storage of all information used for the purposes of AV, specifically but not limited to the requirement (in DSS) that sensitive data must not be stored once an authorisation is received. So, by analogy, once an identity is AV-proofed, the data used to proof the age must not be retained by the proving entity.
- Clarification of the burden of liability for incorrect AV, including punishment for false positives (AV said 18+ when not) or compensation for false negatives (AV said -18 when not).
- Clarification whether it will be a criminal/civil offence to lie to an AV provider? Will this include erroneous AV decisions made by AV providers themselves inferring upon the basis of bad data?
- Statements of liabilities / fines which (especially post-Brexit when the matter will become unclear) the Information Commissioner will bring to bear upon AV providers where they leak sensitive AV information; not only the fact of age 18+, but also which sites requested AV for which people, etc, which can lead to porn- and gender-preference “profiling”.
The broad themes of AV “solutions”
To my mind the solutions presented at the AV Demo fall into five broad categories, mostly distinguished by the source material for the AV check:
“Let’s use Credit Cards!”
I was live-blogging my observations during the AV demo, and when I wrote that one of the proposals was “Credit Card Age Check”, a friend of mine who is a former QSA — a Qualified Security Assessor, i.e. a person who has been certified by the Payment Card Industry Security Standards Council to audit merchants for security standards compliance — responded thusly, verbatim:
Oh dear god no.
No no no no
2) proliferates cc [credit card] data in a way it was not meant to be used
4) Vulnerable to STIP [stand-in processing] attacks
5) messes with OTB [open to buy]
Dear god no
His sentiment is both clearly stated and very negative, and it resonates with sentiment that I have heard expressed by many other security people working in Payment Card Industry (PCI) security over the past five years.
Simply: the payment cards industry do not perceive themselves as providers of AV: they do not profit from it, they do not seek to be authoritative regarding it, they do not want liability for it, and they are not geared-up to provide that information operationally.
His specific points are worthy of review:
1/2) Expensive/Speed: payment networks are already burdened with payment transactions; adding AV transactions on top of the payments, with no revenue generated, will reduce performance for no net gain.
3) Proliferation: every individual has data about them which should not be shared more broadly than is absolutely necessary. It is now generally understood that “Mother’s Maiden Name” is not suitable as information to act as an authenticator, because that information is too easily obtainable.
For a Government-mandated initiative to teach people that it is “okay” to type your credit-card numbers into random sites on the internet — in order to see “free porn” — equates to a Government-mandated boom in identity thefts and fraudulent transactions; especially given the shoddy state of implementation (XSS & CSRF vulnerabilities) of many porn websites.
4) STIP: “Stand-In Processing” (STIP) is a regular bookkeeping process where, perhaps two nights per month, all of the cards of a given bank are temporarily taken offline and transactions are handled by the card association, instead.
Hypothetical: the “Royal Bank of Westminster” issues credit cards which are processed by “VeezaCard”; RBW possess all the cardholder’s personal information whereas VeezaCard generally shunts transactions back and forth between the customers, vendors, and RBW.
However: during “STIP”, RBW asks VeezaCard to “stand in” for the bank during transactions — like a babysitter — whilst the bank goes off to reconcile the last two weeks of transactions. During STIP, VeezaCard will pretend to be RBW and will handle any/all transactions on RBW’s behalf … but with little benefit of RBW’s “parental knowledge”, i.e.: customer information.
Thus, during STIP, VeezaCard is apt to say “yes” to any question which is asked of it, including presumably any transactions that would lead to assuming that the user is over 18 years old. This is clearly an edge case — perhaps mitigable by banning types of credit card which might be issued to a minor, or assuming that the chance of a minor’s exploiting of an adult’s credit card during STIP is fairly low; but however you cut it, this example helps demonstrate the overall lack-of-intention that payment systems were ever designed for identity- or age-verification.
5) OTB: “open to buy” (OTB) is the amount of money defined as the credit limit on your card, less your outstanding balance, less the authorised “holds” placed on your cards by different vendors (e.g. a “deposit” on a rental car). Credit-card AV checks are implemented by placing a “hold” on a small (or even zero) sum of money — say £1 — and then refunding/releasing it.
This has the following impact upon PCI anti-fraud systems:
- Credit-card anti-fraud systems worry about low-value (£1) holds because they are a technique used by fraudsters to validate stolen card credentials. Dilution or swamping of these checks in order to support AV testing at nationwide “porn” scale will have negative consequences upon the fight against card fraud.
- Reciprocally: any independent service which sets-up to provide porn AV services will be used by credit fraudsters to validate stolen credentials.
- The risks associated with low-value / zero-value transactions are currently mitigated by most banks having hard rules about the number of transactions permitted per day. If you watch enough porn you may not be able to buy your groceries on the same day.
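The OTB arithmetic, and why AV holds interfere with it, can be sketched in a few lines. This is a toy illustration only — the figures and function names are invented, not a real payment-network API:

```python
# Toy illustration of "open to buy" (OTB) arithmetic. All names and
# figures are hypothetical; real authorisation systems are far richer.

def open_to_buy(credit_limit: float, outstanding_balance: float, holds: list) -> float:
    """OTB = credit limit, less outstanding balance, less authorised holds."""
    return credit_limit - outstanding_balance - sum(holds)

limit, balance = 2000.00, 350.00
holds = [150.00]                      # e.g. a deposit held by a car-rental firm

print(open_to_buy(limit, balance, holds))   # prints 1500.0

# Each credit-card AV check places a further small hold until released;
# a run of them both depletes OTB and mimics a card-fraud validation pattern:
holds += [1.00] * 10                  # ten outstanding £1 AV holds
print(open_to_buy(limit, balance, holds))   # prints 1490.0
```

The point of the sketch is that every unreleased hold subtracts from spendable credit, and a burst of £1 holds is indistinguishable, at this level, from a fraudster probing a stolen card.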
Recommendation: before instituting use of Payment Cards as proxy for age verification of Porn in the UK, someone should seriously talk to multiple PCI QSAs, and also to senior representatives of the Payment Card Industry in general, as to whether they consider AV to be a proper use of Credit Cards and PCI systems.
“Let’s use Social Media!”
As a former Facebook software engineer, I was horrified by the AV solution proposed by Veridu. This will tread heavily upon my deep-dive, however Veridu requests the following information from your Facebook account:
public profile, friend list, email address, timeline posts, relationships, birthday, work history, education history, events, hometown, current city, photos, likes, tagged places
…data of which they make a copy, and then perform “machine learning” against this data in order to establish whether they think you are 18+, apparently looking for incongruities in your profile information.
Veridu want access to your timeline updates, your photos, all of your personal metadata… essentially a copy of every significant piece of static content that is associated with your Facebook account. This will include a graph of who your friends are, what you “like” — plausibly enough information to link you to other social networks. All of this stored under uncertain controls, with unclear access restrictions and deletion policies.
Veridu actively seek to link this information with your Twitter, LinkedIn and Paypal (and other?) accounts, in order to build a fuller profile.
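Veridu’s actual model is not public, but the *sort* of incongruity-hunting described might look like the following toy heuristic — entirely my own invention, included only to show how fragile such inference is when the inputs are self-reported (GIGO again):

```python
from datetime import date

# Hypothetical sketch of the kind of profile-incongruity check an AV
# provider might run over scraped social-media data. The heuristic and
# its tolerance are invented for illustration; Veridu's model is not public.

def profile_age(birthday: date, today: date) -> int:
    """Age in whole years as claimed by the profile's stated birthday."""
    years = today.year - birthday.year
    if (today.month, today.day) < (birthday.month, birthday.day):
        years -= 1
    return years

def looks_incongruous(birthday: date, school_leaving_year: int,
                      today: date = date(2016, 10, 28)) -> bool:
    claimed = profile_age(birthday, today)
    # Crude inference: assume people leave school at ~18, so the listed
    # leaving year implies an age. Both inputs are self-reported, so a
    # consistent *fake* profile sails through -- garbage in, garbage out.
    implied = today.year - school_leaving_year + 18
    return abs(claimed - implied) > 3

print(looks_incongruous(date(1991, 5, 1), 2009))  # False: the two ages roughly agree
print(looks_incongruous(date(1991, 5, 1), 2015))  # True: leaving year implies ~19
```

Note that a minor who sets a plausible birthday *and* a matching education history defeats this entirely — which is precisely the spot-test failure described below.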
I consider Veridu’s proposed AV solution to be:
- misconceived (“a bad idea”)
- gratuitous (“all your data”)
- disproportionate (“all your data… to prove that you are 18+”)
- risky (“being aggregated where, and who can see it?”)
To this I will also add:
- dubiously correct; a spot-test with a fake-but-plausible Facebook account led to an approval for porn access
- against Facebook’s official “Platform Policy”, per the following quote:
Don’t use data obtained from Facebook to make decisions about eligibility, including whether to approve or reject an application or how much interest to charge on a loan. https://developers.facebook.com/policy/
As with the Credit Card solution above, I have a general sense of this solution having been developed in vacuum and without the buy-in of the information sources which back the solution. This is not just to build a solution on sand, but actually to build one on quicksand, because it is entirely possible that in the future the Credit-Card or Social Networks may shrug off roles which Government-mandated AV solutions are attempting to thrust upon them.
Recommendation: I appeal to my former colleague Baroness Shields — who should be likewise familiar with the risks posed to vulnerable people from unnecessary proliferation of their Facebook profile information — to look again at the impact of not only this but of all proposed AV implementations, and consider from her perspective as Parliamentary Under Secretary of State for Internet Safety and Security whether what is being wrought in the name of age verification and child protection might not actually be a net-negative for Internet Safety and Security.
“Let’s use Mobile Phone Accounts!”
On the back of the previous two examples, it should suffice for me to summarise this as “if you have possession of a mobile phone and can use it, then you must be 18+, because we infer from the telco-database that the phone-user is 18+”.
This is obviously subject to GIGO, and also it’s unclear how robust (or how accurate) the leap from “phone possession” to “age of phone user” may be.
Also: I have documented elsewhere the challenges of getting a Pay-As-You-Go device marked as being owned by someone who is 18+ (and in which I recap, in 2014, some of the credit-card themes that I describe above) — so there is considerable likelihood of False-Negative AV results, where porn sites will lose sales on account of incorrect AV adjudications.
“Technological Wet Dreams!”
There was actually some interesting technical implementation presented at the demo — some of which may be too “hardcore security”, too “high user-experience friction” to be deployed in practice — but I’ll defer detailed discussion to the deep-dive section.
“Piggy In The Middle”
The fifth theme is simply described as “let’s leverage what we already understand or have already got control over, turn that into an AV service, and then monetise it”.
These solutions are delightfully simple — almost primitive — from a technical standpoint, but with that simplicity comes enormous potential political and social ramifications, depending upon whom the service provider is.
Before pitching into the individual solutions, I would like to step back from AV and make a few general observations:
- Regarding “Scale”: the representative of MindGeek said that they expect their solution to be serving AV for 25 million people within one month of deployment; that constitutes 39% of the British population and (from public sources) is comparable to the number of people in the UK who have a Facebook account (~30M)
- Yet this “AV Social Network” would not actually exist, but for the AV provisions of Part 3 of the Digital Economy Bill.
- It’s a really bad idea to habituate the 39% of the British populace into bad security patterns, such as:
- …normalising the exchange of social media data, for porn access
- …typing your phone number into random websites
- …typing your credit card numbers into random websites
- …especially into XSS-able random websites which can be coerced to ask for (and siphon-off) the CVV number from the back of the card
- It’s also a really bad idea for this embryonic AV industry — which is springing to life because of the demands of one part of one draft bill — to be moving headlong into deployment with only a single draft standard (BSI PAS1296) to guide it, one that makes no mention whatsoever of operational security (OPSEC). The current PAS1296 draft mentions data security only in terms of “data processing within the European Union”, hence my recommendation for the borrowing of the PCI DSS standards regarding protection and storage of data.
Deep Dive into Individual Solutions
This bit may be a bit too technical for some; feel free to skim.
1) The SMS Mechanism — Verime
The first proposal was very simple: go to porn site, the age verifier asks for your (UK) phone number, sends you an SMS, to which you reply with “verify” or some other word, and then the age verifier trusts the telecom provider to report whether the phone owner is over 18.
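The round-trip can be sketched in a few lines. Everything here is illustrative — the function names, token format, and telco lookup are my own inventions, not Verime’s real API; the two stubs stand in for the SMS gateway and the telco database:

```python
# Minimal sketch of the SMS age-check round-trip as described at the demo.
# All names are invented for illustration; the stubs below stand in for
# the SMS gateway and the telco's account-holder database.
import secrets

PENDING = {}   # challenge token -> phone number awaiting reply

def send_sms(phone_number: str, text: str) -> None:
    pass   # stub: hand off to an SMS gateway

def telco_says_account_holder_is_18plus(phone_number: str) -> bool:
    return True   # stub: the telco's record of the *account holder's* age

def start_check(phone_number: str) -> str:
    token = secrets.token_hex(8)
    PENDING[token] = phone_number
    send_sms(phone_number, f"Reply 'verify {token}' to confirm")
    return token

def on_sms_reply(phone_number: str, body: str) -> bool:
    token = body.removeprefix("verify ").strip()
    if PENDING.pop(token, None) != phone_number:
        return False
    # The AV decision is delegated wholesale to the telco record --
    # the malleable "proxy" at the heart of the GIGO objection:
    # possession of the handset, not the age of whoever is holding it.
    return telco_says_account_holder_is_18plus(phone_number)
```

Note what the sketch makes explicit: nothing in the exchange establishes *who is holding the phone*, only that someone can reply to it.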
Issues: Fake porn-sites (especially outside the UK) abound; with this mechanism you are training people to give their phone numbers to untrusted websites (whilst still “rewarding” them with porn) and the websites can sell/give these numbers onward to marketing companies.
The mechanism is geared towards UK phone numbers; visitors who geolocate to the UK but do not hold a UK number will have to find/use alternate means.
Authentication can be done with a parent’s phone when they are not looking, or children can get their own phones on their parents’ bill.
Also, perhaps VOIP is an issue?
2) The Social Media Mechanism — Veridu
This was described at length above.
3) The “ID On Your Phone” — Yoti
This one I felt to be quite cute and thoughtful, but perhaps over-engineered. The setup process is as follows:
- install the Yoti App
- use the app to take a selfie to determine that you are a human being
- use the app to take a picture of Government ID documents
- the app sends both documents to Yoti
- Yoti (the third party) now sends both pictures to a fourth party; it was unclear whether personal data (e.g. passport details) is stripped before sending to the fourth party
- Fourth party tells Yoti if the images (selfie, govt id) match
- Yoti caches various personal data about user.
Then, the process to use:
- visit porn site
- porn site posts a QR-like code on screen
- user loads Yoti app
- user has to take selfie (again) to prove that it is (still) them / still the verified person, not a kid using the phone
- user scans the on-screen QR-code, is told: “this site wants to know if you are >18yo, do you approve?”
- User accepts
- Yoti app backchannel informs porn site (via callback URI?) that user >18yo
- user sees porn
Issues: This seems a massively complex setup for an occasional payoff, with high user-experience friction. I find it delightful that the user is electively asked what information to share from the identity provider; however it’s unclear to me how far the scans of government ID documentation will be propagated. Is the outsourced facial-recognition provider / fourth party in possession of a copy of the user’s passport or driving licence? What can they (or Yoti) do with that information?
What privileges does the Yoti app run with, on the phone? Will it be mining/storing geolocations, adding yet another tracking signal? Apparently “it can use GPS.”
Also: this architecture seems potentially vulnerable to the “Mafia” attack from Millican & Stajano’s SAVVIcode paper, in that the QR code is not strongly bound to the site which is ostensibly issuing the verification challenge. XSS or browser malware would likely enable replacement.
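One way the challenge *could* be bound to the issuing site — mitigating the relay attack just described — is to have the AV provider MAC the site’s origin into the QR payload and have the app independently confirm which site the user is looking at. This is my own illustration of the mitigation, not Yoti’s actual protocol:

```python
# Sketch of binding a QR challenge to the issuing site's origin, to
# frustrate the "Mafia"/relay attack. Illustrative only -- not Yoti's
# real protocol; key handling and origin attestation are elided.
import hmac, hashlib, secrets

SERVER_KEY = secrets.token_bytes(32)   # held by the AV provider

def issue_challenge(origin: str) -> str:
    """Mint a QR payload tied to the requesting site's origin."""
    nonce = secrets.token_hex(8)
    tag = hmac.new(SERVER_KEY, f"{origin}|{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    return f"{origin}|{nonce}|{tag}"   # this string is encoded into the QR code

def verify_scan(qr_payload: str, origin_seen_by_app: str) -> bool:
    origin, nonce, tag = qr_payload.split("|")
    expected = hmac.new(SERVER_KEY, f"{origin}|{nonce}".encode(),
                        hashlib.sha256).hexdigest()
    # The app must independently establish which site the user is on;
    # a code relayed from a malicious site carries the wrong origin
    # and the verification fails.
    return hmac.compare_digest(tag, expected) and origin == origin_seen_by_app
```

The hard part, of course, is the last comment: on the open web the app has no trustworthy way to know “which site the user is on”, which is exactly why unbound QR codes are replaceable by XSS or browser malware.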
4) The Credit Reference Data Check — name?
The user provides information against which the age-verifier service can perform a credit reference check, establish an age for the user, and return a result to the porn site that the user is/is not older than 18.
Issues: How is this proof against a replay attack, e.g.: a determined teenager who knows a parent’s age-verifier password?
5) The Credit Card Age Check
The user performs a minor credit-card hold or payment; this — and its challenges — are largely described in the “Credit Card” section, above.
Also, certain debit cards are excluded from the validity check because they are available to people aged 14+, which is below the 18+ threshold.
6) Federated Login — MindGeek
MindGeek are offering something quite clever; rather than come up with a technology of their own they promise to use “anything that works” from the list of AV technologies, and will bundle it/the results up in an easy to use interface which porn sites can buy as a service.
You will use OpenID on the first porn site to unlock an AgeID cookie, and then all subsequent / other porn sites will automatically be pre-unlocked with this single authenticator.
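A single-token unlock of this shape could plausibly be implemented as a signed, expiring cookie that every participating site verifies against a shared key. AgeID’s real token format is not public; the following is purely my own sketch of the pattern:

```python
# Illustrative sketch of a signed "age token" that one verification
# unlocks across many sites. The key, token format, and field names are
# invented; AgeID's actual implementation is not public.
import base64, hashlib, hmac, time

AGEID_KEY = b"shared-with-participating-sites"   # hypothetical shared secret

def mint_age_token(user_id: str, ttl: int = 86400) -> str:
    payload = f"{user_id}|{int(time.time()) + ttl}"
    sig = hmac.new(AGEID_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(f"{payload}|{sig}".encode()).decode()

def token_is_valid(token: str) -> bool:
    user_id, expiry, sig = base64.urlsafe_b64decode(token).decode().rsplit("|", 2)
    expected = hmac.new(AGEID_KEY, f"{user_id}|{expiry}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and int(expiry) > time.time()
```

Note the privacy property that falls out of even this toy design: every participating site sees the same `user_id`, handing the token issuer (and anyone correlating logs) a linkable identifier across every porn site a person visits — which is the “godlike perspective” objection below.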
Issues: This solution provides MindGeek a godlike perspective on a vast quantity of porn-related traffic — who logs into which sites first, what other sites they go to, which sites are popular (and perhaps which MindGeek might like to acquire?) — as well as potentially bounce rates, referral rates, and a bunch of other analytics.
MindGeek apparently have no plans to launch an ad-tracking network a-la DoubleClick — I asked — but with their expectation of “25 million UK users in the first month” it would be hard to justify not monetising the advertising-tracking potential some few years down the line.
The MindGeek presentation was the first where I heard one of the speakers refer to data protection (“this is not going to be another Ashley Madison”) — referring to the systems as being “PAS 1296 Compliant”, referring to the draft “Online age checking — Code of practice” document. My review of PAS1296 (login required for access) shows that it covers “data protection” in a document annexe — but only to the extent of regulatory protection (e.g.: the data must be processed in Europe, sort of thing) without reference to operational security standards such as PCI-DSS.
Also: a hypothetical outage of the MindGeek AgeID service — DDoS, or DNS censorship — could inhibit porn-access by the entirety of the UK, making it an interesting single point of failure in any scenario.
7) Experian Digital Identities — Experian
Create an Experian Digital Identity. Experian know how old you are. Log into porn sites. Experian is apparently happy for their brand to be associated with logging-into porn sites, because “social responsibility”.
Issues: You are logging into a porn site with your credit reference agency password. Does anyone else feel that’s a bit weird?
Also: fake websites harvest your Experian password.
8) Credit Card Metadata — Zerado
Posit: certain forms of credit card are available to 18+ people only, so we can use an app on a mobile phone to read the card over NFC, and then pattern-match the PAN (i.e. the long card number) against a database of card-types which are only issued to people who are 18+.
This verifies you as being 18+ because you physically possess an 18+ credit card. Presumably reading the card over NFC is important because typing-in the PAN would be too easily forgeable. Apparently no actual transactions happen with your card, it’s all a proof-of-possession thing.
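The pattern-match itself is straightforward, which is part of the point. The sketch below checks a PAN against a *hypothetical* list of adult-only issuer prefixes, after a standard Luhn sanity check; the example prefixes are invented, real BIN tables are licensed and far larger, and Zerado’s actual database is not public:

```python
# Toy PAN-prefix (IIN/BIN) check against a hypothetical list of card
# ranges issued only to adults. Example prefixes are invented; real BIN
# databases are licensed commercial datasets.

ADULT_ONLY_BINS = {"454313", "492181"}   # invented example prefixes

def luhn_ok(pan: str) -> bool:
    """Standard Luhn check-digit validation of the card number."""
    digits = [int(d) for d in pan][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

def card_implies_18plus(pan: str) -> bool:
    # Proof of *possession of a card type*, not proof of the holder's age.
    return luhn_ok(pan) and pan[:6] in ADULT_ONLY_BINS
```

As the comment notes, the check proves only that an 18+-issued card was physically present — which is precisely where the audience question that follows lands.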
Issues: audience member: “What if you borrow your mum’s card?” — I didn’t catch the response to that question, I’ll update this posting if I get one.
9) The Gaming Industry Age Check — name?
paraphrasing: “We’re from the gaming industry, we’ve already got this sorted, we have a suite of solutions. Can’t really demo them.”
No slides. Pleasant chap, was very polite about questions relating to the matter of AV for non-chargeable sites.
Gaming solutions seek to be non-circumventable — it is illegal for -18 people to participate in gambling (excepting the National Lottery, which is 16+), so there is a legal prohibition upon the player which the companies must seek to enforce — and by design, in gaming, money must be involved. The Digital Economy Bill’s 18+ content restriction is not practically intended — according to the people in the demo room — to actually apply to 16+, so there is both a flaw in the bill’s design goals, plus “gaming” does not constitute an apples-to-apples analogue for porn restriction.
10) e-wallet micropayments — ICM Registry
Apparently ICM Registry operates the .xxx (pronounced “dot triple-X”) sponsored top-level domain (sTLD) registry, which is designed for pornography. They propose an e-wallet with a virtual currency that can be used for micropayments. If you go to a site and view nothing, then no charges will apply.
Issues: Do we need yet another e-currency? Not all porn on the Internet will be charged for, e-payment requires money up-front to obtain coins/tokens, “free” sites will have to riffle through your e-wallet to establish that you possess one, even if they take no “money” from it.
There were two more speakers: first Chris Ratcliff from Portland TV, whose themes were primarily that people are becoming more accepting of forms of identity (or age) verification in pursuit of porn, and that once established, AV will likely ratchet.
Then Pandora Blake, who has already captured her perspectives very well in a couple of blogposts which I will not recap, but with themes touching upon the highly disproportionate regulatory costs to be borne by small UK Porn SMEs, and how some above-R18 material (R = restricted, e.g.: fetish) will become illegal to show online at all.
Afterwords — Final Recommendations
On the back of everything that I have heard, I consider the Age Verification requirements which are promulgated in the Digital Economy Bill to be:
- illiberal — in that they remove freedoms and chill communications
- ineffective — in that they must by both design and intention be either circumventable or oppressive.
- misconceived — in that they are a doubly-indirect technical bodge that attempts to address a social challenge (enable secure proving mechanisms, to verify age, to inhibit -18 people from accessing porn, to protect the young from accessing porn)
- harmful — in that they will educate a large number of Britons into risky information security practices, in pursuit of a narrow (and trivial) goal
- dangerous — in that solutions that seek to address this challenge will create attractive centralised databases of sensitive data which need not otherwise exist.
Given this, I would recommend the excision of AV from the Digital Economy Bill, to be reconsidered and submitted instead under a separate bill.
Absent that, I would amend the following section:
23 Exercise of functions by the age-verification regulator
(1) The age-verification regulator may, if it thinks fit, choose to exercise its powers under sections 20 and 22 principally in relation to persons who, in the age-verification regulator’s opinion —
(a) make pornographic material or prohibited material available on the
internet on a commercial basis to a large number of persons, or a large
number of persons under the age of 18, in the United Kingdom; or
(b) generate a large amount of turnover by doing so.
…replacing the trailing “or” in section (a) with “and”, in order to relieve the regulatory burden on small Porn SMEs. Section (a) is of itself meaningless, because any document on the Internet is inherently “available to…a large number of persons” — several billion at last count.
Other recommendations are spattered throughout this text.
Thank you for reading.
Thanks to:
- Chris Ratcliff, for hosting everyone
- Wendy Grossman, for notes; see also Wendy’s write-up at net.wars
- Myles Jackman and Pandora Blake, for context and insight
- An anonymous QSA, for acting as a PCI sounding board and also for letting me plagiarise wildly from various rants
Errata and Changelog
- rephrase: “a high price would be paid…”
- links: added wikipedia links for XSS and CVV
- clarify: upon AV providers where they leak sensitive
- rephrase: …technical bodge that attempts to address…
- rephrase: …in that they will educate a large number of Britons into risky information security practices, in pursuit of a narrow (and trivial) goal…
- rephrase: …solutions that seek to address this challenge will create attractive centralised databases of sensitive data which need not otherwise…
- rephrase: ban upon indirect monetisation or derivative use
- added: one of these mechanisms seeks to leverage …
- added para: Gaming solutions seek to be non-circumventable…
- amended: whether available technological solutions are fit for one’s goals
- added: … prevent “inference” from being shared as “fact”.
- rephrase: physically possess
- added: Also: fake websites harvest…
- amended & extended: Do we need yet another e-currency? Not all…
- minor typos
- add banner image
- added: let alone “insider” risks