Blocked! How filters censor the Internet
Some notes from the UK’s biggest digital rights conference, this year focussing on government surveillance
Saturday, 15 November, 2014: Several hundred people interested in digital rights congregate at King’s College London’s Waterloo campus, for the Open Rights Group’s 2014 conference. I am proud to have been elected to the board of ORG in 2013, having been a founding member of the Advisory Council, so I figured I should share some of my notes from the conference.
I’ve already written about Cory Doctorow’s opening keynote, but there were many other sessions across the day. To avoid these pieces becoming too unwieldy, I’m covering individual sessions in each piece. I didn’t attend this session myself, so apologies for the somewhat sketchy write-up, based mainly on the notes and slide deck from Richard and Ruth but omitting any record of the Q&A discussion.
While I was in the session “Surveillance, whistleblowing and the media”, ORG’s project manager Richard King and Ruth Coustick-Deal, our supporter officer, were revealing the results of Blocked!, our project that’s helped to expose the censorship caused by filters that aim to prevent children from seeing adult content.
Richard and Ruth mentioned ORG’s Department of Dirty site, and Richard talked about the Blocked! Quests, which were designed for Mozilla Festival to challenge people’s thinking and help them learn about web filtering in the UK. They started out by explaining the different kinds of content that are censored in the UK:
Criminal content: Child abuse is abhorrent, so arguments in favour of censoring images of this abuse are the most easily defended. Similarly, the day before ORGcon, Downing Street announced that terrorist material will also be blocked. As Amelia Andersdotter, a speaker on one of our later panels, tweeted (in Swedish), one of the problems we face here is the lack of transparency over web filtering — there is no oversight around blocking and no review process:
We have a separate campaign on this topic, called Error 451. Also there’s plenty of doubt over the effectiveness of blocking, rather than removing content at source, for example:
But this is not the focus of our Blocked! campaign.
Intellectual property: The blocking of websites that facilitate copyright, trademark or other civil infringement is controversial. That said, at least where these blocks exist in the UK they have been imposed by a court after due process. But we think there could be better safeguards for consumers, such as time limits, minimisation of collateral damage and making what has happened visible to the people who try to access these sites.
‘Undesirable’ content: Filters affecting legal content are the least defensible form of censorship in the UK. At best, they’re actively chosen by an individual and affect only that person. At worst they can be used by malicious actors to enforce their world-views on others and deny people access to information.
Legal-content filters extend from censoring the extreme to the esoteric. And that’s assuming they work as advertised. But they don’t. All sorts of legitimate websites get blocked in error while many sites of the type casual users might expect to be blocked are let through.
That’s why legal-content filters are the focus of our campaign. ORG’s campaign includes the Department of Dirty:
The government wants us to think in Manichæan terms, with “good” websites and “bad” websites and filters that are able to distinguish between them; no pesky grey areas getting in the way. David Cameron announced “one click to protect your whole home and to keep your children safe” and that’s what parents are being sold by ISPs nodding along with him. But filtering is more complex than that.
“Good” sites get filtered out — this is “overblocking” — and “bad” sites get let through. Filters cost ISPs money and they have little incentive to strive for excellence; they outsource the job to companies abroad and so they can blame someone else for the failures.
We know they don’t work as well as industry and the government claim. They just block a bunch of sites, some of which are good, some of which are bad, and all of which are legal. But the promise of “one click to safeguard your family” can lead care-givers into complacency. We’ve flicked the magic safety switch! All is well! No more parenting needed!
People tend to accept defaults; the ‘nudge theory’ that the government uses to try to influence our decisions and behaviour takes that as a given. Encouraging everyone to accept adult Internet filters means millions of adults will lose access to all sorts of material rightly or wrongly categorised as inappropriate for under-18s; mobile companies simply assume you’re a child.
There are several particular problems with web filtering:
- Although Ofcom and the BBFC are both now involved, there is little meaningful oversight of website classification. It’s just left to the judgement of the largely foreign companies to whom filtering is outsourced.
- It’s difficult to get sites unblocked; if you complain to your ISP, the typical response is “would you like us to switch filtering off?” This solves the problem of misclassification for neither the customer nor the site owner.
- Each network’s filters work differently so independent monitoring is hard.
- If you own a website, how do you find out if it’s being blocked? ISPs won’t tell you!
That’s why we built www.blocked.org.uk. Based on free software hosted on Github and built by ORG’s volunteers, the website allows you to type in a URL and check whether or not it is blocked by any of the UK’s mobile networks and most of the UK’s home broadband networks. The site also collects reports of filtering failures and advises on how to correct inappropriate blocks. The site is sponsored by Bytemark and Andrews & Arnold, who provide not only hosting space but also subscriptions to the main UK ISPs so that we can perform these checks.
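To make the mechanics concrete, here is a minimal sketch of how a checker like this could work. It is illustrative only, not the actual Blocked.org.uk code (which is on GitHub): the block-page signatures, function names and stub probes are all assumptions; in the real system each probe sits behind a live subscription to an ISP’s consumer line.

```python
# Hypothetical sketch: fetch a URL through one "probe" per ISP and decide
# whether each ISP served the real page or a recognisable block page.

# Assumed block-page wording -- real block pages vary by ISP.
BLOCK_PAGE_SIGNATURES = [
    "this site has been blocked",
    "parental controls",
]

def classify_response(body: str) -> str:
    """Classify a fetched page body as 'blocked' or 'ok'."""
    lowered = body.lower()
    if any(sig in lowered for sig in BLOCK_PAGE_SIGNATURES):
        return "blocked"
    return "ok"

def check_url(url: str, probes: dict) -> dict:
    """Fetch `url` through each probe (one per ISP) and report its status.

    `probes` maps ISP name -> a fetch function. Here the probes are stubs;
    in practice each would make an HTTP request over that ISP's connection.
    """
    return {isp: classify_response(fetch(url)) for isp, fetch in probes.items()}

# Stub probes standing in for real ISP connections:
probes = {
    "ISP-A": lambda url: "<html>Access denied: Parental Controls</html>",
    "ISP-B": lambda url: "<html>Welcome to the site</html>",
}

print(check_url("http://example.org", probes))
# {'ISP-A': 'blocked', 'ISP-B': 'ok'}
```

One real-world complication this sketch glosses over: each network’s filters behave differently (redirects, silent timeouts, DNS tampering), which is exactly why independent monitoring is hard.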
Ruth expanded on our progress so far: When she put together the stats for her talk, we had checked 203,636 websites on Blocked.org.uk (there will have been more since, obviously).
Nearly 20,000 websites (20%) are blocked by strict filters; nearly 10,000 (10%) are blocked by default filters. The four largest ISPs in the UK, who between them have 95% of the market, block around 12% of the Alexa top 100,000 sites from their users.
(These are figures from the Alexa top 100,000, not from our Blocked.org.uk stats, hence the percentages don’t add up against those stats. We ran tests on the Alexa top 100,000, putting the sites through our probes; this is more useful data than the information from Blocked.org.uk, as that dataset is self-selecting, rather than objective information.)
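Aggregating results like these is straightforward once the probes have run. The following is a hypothetical sketch of how per-ISP block rates over a site list (such as the Alexa top 100,000) could be computed; the record format and names are illustrative, not ORG’s actual pipeline.

```python
# Hypothetical sketch: turn raw probe results into a per-ISP block rate.
from collections import Counter

def block_rates(results):
    """results: iterable of (site, isp, status) tuples from the probes.

    Returns {isp: fraction of tested sites that ISP blocked}.
    """
    blocked = Counter()
    total = Counter()
    for site, isp, status in results:
        total[isp] += 1
        if status == "blocked":
            blocked[isp] += 1
    return {isp: blocked[isp] / total[isp] for isp in total}

# A tiny made-up sample in place of 100,000 real results:
sample = [
    ("a.example", "ISP-A", "blocked"),
    ("b.example", "ISP-A", "ok"),
    ("a.example", "ISP-B", "ok"),
    ("b.example", "ISP-B", "ok"),
]

print(block_rates(sample))
# {'ISP-A': 0.5, 'ISP-B': 0.0}
```

Running the whole Alexa list through every probe, rather than relying on user-submitted URLs, is what makes these percentages objective rather than self-selecting.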
Mobile ISPs filter around 5% of the Alexa top 1,000:
Broadband ISPs tend to block more:
There are a couple we surveyed, however, who scored much better than the rest:
The stand-out difference is that Plusnet blocked only four sites, all of them Pirate Bay domains. Why? Because Plusnet doesn’t have a filter, and the Pirate Bay is blocked by court order rather than by subject lists produced by some company.
Why are the results so different between mobile networks and broadband ISPs?
The BBFC sets benchmarks for mobile providers. It classifies films by surveying what the general public finds offensive. Its classifications are still subjective, but at least it acts as an independent adjudicator. The phone networks pay the BBFC to adjudicate disputes over whether a given site should or shouldn’t be blocked.
But the main ISPs block whole categories rather than content — and some categories are extremely broad:
In case you are merely human and find that image confusing and difficult to read, here are some interesting points about BT’s filtering options:
- The category “obscene and tasteless” includes “bathroom humour” on a par with “criminal activity” and is blocked by default. This category also blocks “sites with information about illegal manipulation of electronic devices, hacking, fraud and illegal distribution of software”.
- The category “drugs” includes merely “information on illegal drugs” and is blocked by default.
- Sites selling alcohol & tobacco are blocked by default — because we have such a problem with kids downloading booze and getting drunk on the web(!)
- With moderate blocking, you can block non-sexual nudity — but kids can look at their own bodies in real life.
- The social networking, file-sharing, media streaming and search engine categories are censorship by form not by content.
- The sex education category includes “respect for partner”, which we understand to be a euphemism for domestic violence.
We think there is some danger in these categories being available for people to apply — they are too broad for most young users. See more information on the ORG wiki at wiki.openrightsgroup.org/wiki/BT_Parental_Controls
Also, from an email conversation with BT, in addition to the selectable categories, “all customers opted into Parental Controls are prevented access to sites promoting the use of proxies and anonymisers”.
Games are a major category, yet games are largely for kids: the most-played games are aimed at young people — Minecraft and Moshi Monsters have been topping the charts all year — and much of their take-up is among under-tens. Blocking by form rather than by content simply doesn’t make any sense.
What else have we learnt anecdotally?
We were particularly concerned that, because many charities work to respond to problems or issues surrounding drugs, sex education, smoking, mental health and abuse, their websites would contain the same words as sites that promote those things; so we built lists of charities and ran them through our probes.
On a sample, we found that at least 54 Scottish registered charities have websites which are blocked by one or more of the main UK ISPs. These include Aberdeen-based Alcohol Support, a Dundee equalities project called Different Visions Celebrate that works with under-25s “who have any issues or concerns due to their sexuality or the sexuality of a family member” and the Say Women project in Glasgow which offers “safe, supported accommodation and related services for young women, aged 16–25 years, who are survivors of childhood sexual abuse, rape or sexual assault and who are homeless or threatened with homelessness”.
We’ve even seen such “dangerous” content as feminism being blocked — in July 2014, the mobile operator Three even blocked a Jezebel piece on maternity leave.
So what policy recommendations have we drawn from this project?
- It shouldn’t be us. Awesome as this tool is, we only have to run it because filters are a form of censorship.
- When something is banned we should know why. It should be clear what’s going on and at the moment ISPs refuse to provide a tool like this.
- If there is no legal framework, how does a wrongly blocked site get unblocked?
Richard and Ruth’s slide deck is available online.
The project is far from complete and there are loads of exciting directions in which we could take it next. But to do that we need your help. If you are a coder, documenter, website owner, educator, graphic artist or videographer, if you speak more than one language, or if you simply have a great idea for this project, then please get involved!
ORGcon 2014 was generously sponsored by F-Secure and Andrews & Arnold Ltd. The Open Rights Group exists to preserve and promote your rights in the digital age; we are funded by hundreds of people like you.
This article is dedicated to the public domain under the terms of the Creative Commons Zero licence. Please translate, copy, excerpt, share, disseminate and otherwise spread it far and wide. You don’t need to ask me, you don’t need to tell me. Just do it!