There’s no way to stop children viewing porn in Starbucks

With all the discussion (and FUD) currently circulating about “porn” filters, we felt it was important for people to fully understand some of the issues involved.

Cory Doctorow is a much-respected author, blogger and activist.

This article was first printed in the Guardian on 13 November 2012.

An MP3 (podcast) version can be found here.

Last week’s debate in the Lords on the proposal to introduce opt-out pornography filters was a perfect parable about the dangers of putting technically unsophisticated legislators in charge of technology regulation.

The Lords are contemplating legislation to require internet service providers and phone companies to censor their internet connections by default, blocking “adult content,” unless an over-18 subscriber opts out of the scheme.

On its face, this seems like a reasonable plan. When I wrote to my MP, Meg Hillier, to let her know I objected to the scheme, she wrote back to say that, despite the imperfections in a porn filter, it was better than nothing, because kids would always be more sophisticated than their parents when it came to internet technology. The last part is mostly true, but the first part is a nonsense.

In order to filter out adult content on the internet, a company has to either look at all the pages on the internet and find the bad ones, or write a piece of software that can examine a page on the wire and decide, algorithmically, whether it is inappropriate for children.

Neither of these strategies is even remotely feasible. To filter content automatically and accurately would require software capable of making human judgments – working artificial intelligence, the province of science fiction.
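
To see why the algorithmic route fails, consider a deliberately naive keyword filter. This is a toy sketch, not any vendor’s actual product – the blocklist and the sample pages are invented for illustration – but real filters, however elaborate, share its two failure modes:

```python
# A toy keyword-based "adult content" filter. Deliberately naive, but it
# illustrates the structural problem any real-time classifier faces.
BLOCKLIST = {"sex", "breast", "nude", "porn"}

def is_blocked(page_text: str) -> bool:
    """Block the page if any blocklisted word appears anywhere in it."""
    words = page_text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# A medical page is blocked: a false positive.
print(is_blocked("Early screening improves breast cancer outcomes."))  # True

# An explicit page that avoids the listed words sails through: a false negative.
print(is_blocked("Hot XXX pics, no signup required!"))  # False
```

Bigger word lists and statistical classifiers shrink these errors but cannot eliminate them: judging whether a page is “adult” ultimately requires understanding what the page means.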

As for human filtering: there simply aren’t enough people of sound judgment in all the world to examine all the web pages that have been created and continue to be created around the clock, and determine whether they are good pages or bad pages. Even if you could marshal such a vast army of censors, they would have to attain an inhuman degree of precision and accuracy, or they would end up running a system of censorship never before seen in the world, because they would be sitting in judgment on a medium whose scale is beyond any in human history.

Think, for a moment, of what it means to have a 99% accuracy rate when it comes to judging a medium that carries billions of publications.

Consider a hypothetical internet of a mere 20bn documents, made up of one half “adult” content and one half “child-safe” content. A 1% misclassification rate applied to 20bn documents means 200m documents will be misclassified. That’s 100m legitimate documents blocked by the government because of human error, and 100m adult documents that the filter does not touch and that any schoolkid can find.
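
For concreteness, here is the same sum as a few lines of Python. The 20bn total, the 50/50 split and the 1% error rate are the article’s illustrative assumptions, not measurements:

```python
# Worked version of the hypothetical above; all figures are illustrative.
total_docs = 20_000_000_000            # a "mere" 20bn documents
adult = child_safe = total_docs // 2   # half adult, half child-safe
error_rate = 0.01                      # a generous 1% misclassification rate

overblocked = int(child_safe * error_rate)   # legitimate pages wrongly blocked
underblocked = int(adult * error_rate)       # adult pages wrongly let through

print(f"{overblocked:,} legitimate documents blocked in error")  # 100,000,000
print(f"{underblocked:,} adult documents that slip through")     # 100,000,000
```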

In practice, the misclassification rate is much, much worse. It’s hard to get a sense of the total scale of misclassification by censorware because these companies treat their blacklists as trade secrets, so it’s impossible to scrutinise their work and discover whether they’re exercising due care.

What’s more, many of the people whose content is misclassified are not in a position to discover this (because they are operating a website from outside the country and can’t tell whether they’re being blocked unless a reader in the UK tells them so), and the readers whose requested documents are blocked are often children, or technical naifs, neither of whom is likely to know how to complain effectively about the blocks.

But it’s instructive to look at the literature on overblocking. In 2003, the Electronic Frontier Foundation tested the censorware used by US schools to see how many of the most highly ranked documents on concepts from the national school curriculum were blocked by the schools’ own censorware. They discovered that 75-85% of these sites were incorrectly classified. That percentage went way, way up when it came to sensitive subjects such as sexuality, reproductive health, and breast cancer.

That study is a decade old, and dates from a time when the web was comparatively minuscule. Today’s web is thousands of times larger than the web of 2003. But labour isn’t thousands of times cheaper, and good content has not gotten thousands of times easier to distinguish from bad content in the interim.

So when Lady Benjamin spoke of wanting to protect “students doing research for a project”, she should have also been thinking of protecting students’ access to the documents necessary to do their homework. I just returned from a tour with my new young adult novel, Pirate Cinema, that had me visiting schools in 18 cities in the US and Canada.

Over and over again, teachers and students described the problems they had with school censorware. Not only did their networks block the wrong things, but what was blocked changed from minute to minute, making it nearly impossible to integrate the internet into the curriculum and presentations.

Teachers told me that they’d queue up a video to show to their afternoon class, only to have it incorrectly blocked over the lunchbreak, leaving their lesson plan in tatters, with only minutes to come up with an alternative. Students who had come to rely on a site for information related to a project returned to those sites to confirm something, only to discover that it had been put out of reach by a distant, unaccountable contractor working for the school board.

But it’s not just “good” material that gets misclassified. There is unquestionably a lot of material on the internet kids shouldn’t be seeing, and it multiplies at a staggering rate. Here, you have the inverse of the overblocking problem. Miss 1% – or 10%, or a quarter – of the adult stuff, and you allow a titanic amount of racy material through. Students who are actively seeking this material – the modern equivalent of looking up curse words in the school dictionary – will surely find it. Students who are innocently clicking from one place to another will, if they click long enough, land on one of these sites. The only way to prevent this is to block the internet.

So when Lady Massey complains that Starbucks has failed its civic duty by allowing unfiltered internet connections in its cafes, she’s rather missing the point. Even a filtered connection would pose little challenge to someone who wanted to look at pornography in a public place. Meanwhile, any filter deployed by Starbucks will block all manner of legitimate material that its customers have every right to look at.

The only way to stop people from looking at porn – in print or online – in Starbucks is to ask them to leave if you see them doing it.

So far, I’ve been writing as though “adult” content can be trivially distinguished from “family friendly” content. The reality is a lot muddier. There’s plenty of material in a day’s newspaper that I wouldn’t want my four-year-old daughter to hear about, from massacres and rapes in Mali to the tawdry personal stories on the agony aunt page. Sorting the “adult” internet from the “child” internet is a complex problem with no right answer.

But you’d never know it, to listen to the censorware vendors. When Boing Boing, the website I co-own, was blocked by one censorware vendor, it was on the grounds that we were a “nudity” site. That was because, among the tens of thousands of posts we’d made, a few were accompanied by small (non-prurient, strictly anatomical) images showing some nudity.

When we argued our case to the vendor’s representative, he was categorical: any nudity, anywhere on the site, makes it into a “nudity site” for the purposes of blocking. The vendor went so far as to state that a single image of Michelangelo’s David, on one page among hundreds of thousands on a site, would be sufficient grounds for a nudity classification.
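
The logic the representative described amounts to something like this. The sketch below is my rendering of the stated policy, not the vendor’s actual code, and the page names are hypothetical:

```python
def classify_site(site_pages: list[str], flagged_pages: set[str]) -> str:
    """Site-level label under the vendor's stated policy: one flagged
    page, anywhere on the site, taints the entire site."""
    for page in site_pages:
        if page in flagged_pages:
            return "nudity site"  # a single image of David is enough
    return "clean"

# One flagged page out of 100,000 condemns all 100,000.
pages = [f"/post/{i}" for i in range(100_000)]
print(classify_site(pages, flagged_pages={"/post/31337"}))  # "nudity site"
```

Note that the dose is irrelevant: one page in a hundred thousand produces the same verdict as a site that is wall-to-wall nudity. Hence the “homeopathic standard” described below.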

I suspect that none of the censorship advocates in the Lords understand that the offshore commercial operators they’re proposing to put in charge of the nation’s information access apply this kind of homeopathic standard to objectionable material.

When Lady Howe compares mandatory, opt-out censorware to a rule that requires newsagents to rack their pornographic magazines on the top shelf, I doubt she would stretch her analogy to saying, “If a magazine comprising a hundred thousand pages has a single picture of Michelangelo’s David anywhere in its pages, it must have a brown paper cover and be put out of reach of children.”

Such an analogy is a nonsense. We don’t have magazines with a hundred thousand pages. We don’t have town halls or round-tables with millions of participants, all speaking at once. We don’t have cable or satellite packages with a billion video channels. Though it is sometimes useful to draw analogies between the internet and all the media of yesteryear, it’s important not to lose sight of the fact that these are only analogies, and that the only thing that the internet can properly be compared to is the internet.

But back to censorware, and the vendor who censored my site. That company and its competitors are pretty unsavoury. The company does most of its business with Middle Eastern dictatorships, which rely on its software to stop their citizens from reading the world’s free press. The whole censorware industry’s bread and butter is selling software that despots use to commit, and cover up, human rights abuse.

These are the firms that we’re proposing to put in charge of the nation’s information. What’s more, as I’ve written in the past, internet censorship cannot be separated from internet surveillance. If these companies are to stop us from seeing the pages they’ve blocked (according to their own secret, arbitrary criteria), they have to be able to look at all the pages we request, in order to decide which page requests may be passed through and which ones may be blocked.

What’s more, as Hillier was quick to remind me, kids are pretty technologically sophisticated. Parents who rely on this filter will quickly discover that their kids have no trouble evading it. As I discovered on my tour, kids know how to search for open proxies on the internet and use them to beat the filters. Unfortunately, by driving kids to these proxies, we put them at risk of surveillance by another group of unaccountable strangers, as the proxy operators have the ability to see (and even to interfere with) the kids’ traffic.

Filtering isn’t better than nothing. It blocks unimaginable mountains of legitimate content. It allows enormous quantities of the most ghastly pornography through. It gives unaccountable, sleazy censorware companies the ability to spy on all of our online activity. It doesn’t help parents control their kids’ online activities.

Lord Swinfen says this is a plan to “ensure children do not accidentally stumble on pornography”. He says that its opponents oppose all internet regulation. But really, this proposal isn’t about preventing kids from seeing porn, it’s about slightly reducing the likelihood that they will. Opposing this law isn’t about opposing all internet regulations, just bad ones that do almost no good and enormous harm.

I don’t have a good plan for stopping kids — even my kid — from seeking out bad stuff on the internet. I sit with her when she uses the net (something I can do while she’s small and doesn’t have her own phone) and help her understand how to navigate the net, and what I expect from her. I can only hope that this will be sufficient, because there is no filter, no app, no parental control, that will stop her from getting access to bad stuff if she really wants to. That’s the reality of things. If the Chinese government – with its titanic resources and armies of network engineers – can’t stop its citizens from seeing the things it wants to block, what hope do I have?

I know that my position would be stronger if I could offer an alternative to censorware. But such an alternative would be wishful thinking. When the government decides that it is its job to stop kids from accidentally or deliberately seeing adult material on the internet, it is tantamount to putting out a call for magic beans, and announcing that the budget for these magic beans is essentially unlimited. As we’ve seen before – with NHS IT integration, the national database, and other expensive and wasteful IT boondoggles – there is no shortage of unscrupulous sorts in nice suits who’ll offer to sell you as many magic beans as you’re buying. That doesn’t make magic beans any better at solving the nation’s problems. The fact that someone will sell you a solution to the great, intractable problems of the modern age doesn’t mean that it will work.

Source: Guardian online, CrapHound.com

