
UK govt sidles towards censorship

By Jon Tullett, Editor: News analysis
Johannesburg, 01 Aug 2013
The UK is joining a growing list of countries which enforce online filters.

The UK government, under the leadership of David Cameron, is on the brink of mandating nationwide censorship, requiring Internet service providers to filter, block and report on all citizens' browsing habits, starting with pornography. The UK will join a growing number of countries which actively censor the Internet, including most Arab states, China, and Australia.

But, not yet SA. Not for lack of trying - SA ventured down this road with Malusi Gigaba's aborted attempt to push through Internet filtering in 2010. Governments around the world demonstrate that failure to pass an unpopular measure just means you go away and try again later, so we'll probably see future attempts to implement similar filters, especially as other Western nations go down the same path, setting useful precedents for their political colleagues abroad.

There are several similarities between Cameron's filters and Gigaba's, as well as the Australian initiatives. They make many of the same mistakes, stem from similar sources, and often conceal the real motives.

If at first you don't succeed

In SA, commentators at the time noted that the proposed anti-porn Bill was likely unconstitutional, and it failed to progress. Other nations setting precedents for filtering will give new weight to the inevitable next attempt, which could well succeed if it avoids the wider-reaching parts which raised constitutional concerns last time. In the UK, attempts to automatically filter pornography have been rejected in the past, but that never stops the lobbyists. Cameron may this year succeed where the government was previously thwarted by the will of the people.

As the UK and Australia have shown, the important step is the first one. Any filter is a start, from which governments can expand later, especially since the actual filter lists are kept secret. In Australia, for example, the filter was intended to block only illegal sites, but when the list was leaked to WikiLeaks, it was found to contain many unusual entries, such as dentists, legitimate gambling sites, and discussion forums. The list was quickly cleaned up by ACMA, the Australian Communications and Media Authority, but questions of oversight linger. Ironically, the list also included WikiLeaks itself, putting Australia on a short list of countries, including China and the UAE, that have blocked access to the site.

In the US, repeated attempts to push through censorship legislation culminated in SOPA, which was rejected early in 2012. It'll be back, probably as PIPA (the Protect Intellectual Property Act), itself a rewrite of COICA, which was in turn rejected in 2010. They're a persistent lot, these lobbyists.

Outside of the US, most filtering laws are proposed under the guise of protecting people - the youth, in the case of the UK and Australia, or the public as a whole in the case of SA, China, and most Arab countries.

Bills are proposed with the assistance of moral activist groups, usually with religious foundations, focused on enforcing moral standards. Outside of orthodox Muslim nations, where content restrictions are typically decided and enforced by religious authorities, these proposals usually seek to extend the scope of the law in line with the group's thinking - in Gigaba's case, that group was JASA, the "Justice Alliance of SA". In the UK, the central report promoting porn filtering was sponsored by Premier Christian Media, in partnership with Safer Media, a religious group whose stated aims include curbing the availability of content it deems "harmful", including violence, pornography, bad language, anti-social behaviour, and drugs. If the group continues to exert influence over Cameron and his advisors, it is unlikely to end with a ban on pornography. The signs are already there: the alleged template of blocked material sent to ISPs lists "extremist material", "Web forums", "alcohol" and "smoking" among the categories to be blocked by default.

Internet users who do happen to like pornography aren't campaigning to force everyone else to watch it.

JASA and similar bodies around the world usually make the same mistakes: assuming their moral standards should be enforced on everyone, like-minded or not; believing that filters will work; making questionable claims about the impact and scope of "immoral" content; and hoping that censorship will improve matters. Unfortunately, those mistakes don't prevent them from gaining the ear of politicians, even to the extent of passing laws. And in some cases, they can make matters worse, despite their good intentions.

Sauce for the goose?

Although their intentions may be honourable, it's not clear why these groups think banning legal, but distasteful, material is acceptable. Internet users who do happen to like pornography aren't campaigning to force everyone else to watch it, after all. If you don't want to watch porn, don't! If you don't want your dependents to watch porn, then implement your own personal filter, either on your network at home or via your ISP. Ask 10 different people what constitutes legal but offensive material, and you'll get 10 different answers. Why should one of them get to enforce their view on the other nine? It was this, in fact, which led to constitutional challenges to JASA's proposals: the South African Constitution is explicitly non-denominational.

If at first you don't succeed...

A selected history of US federal laws including Internet filtering, blocking or censorship. Several more, including state laws, are omitted.
1996: Communications Decency Act (CDA) passes, ruled unconstitutional in 1997
1998: Child Online Protection Act passes. Repeatedly challenged on constitutional grounds; permanent injunction in 2009
1998: Digital Millennium Copyright Act (DMCA), including ban on discussion of circumvention technology
2000: Children's Internet Protection Act (CIPA), requiring federally funded schools and libraries to filter Internet access for minors
2006: Deleting Online Predators Act (DOPA) fails to pass
2007: DOPA fails to pass again
2007: And again
2008: Trading with the Enemy Act, includes blacklist of sites which registrars must block
2010: Protecting Cyberspace as a National Asset Act fails to pass
2010: Combating Online Infringement and Counterfeits Act (COICA) fails to pass
2011: Stop Online Piracy Act (SOPA) introduced, including sweeping censorship powers. Politically well supported until mass action forces postponement in 2012
2011: Protect Intellectual Property Act (PIPA) proposed. Postponed after mass action
2011: Cyber Intelligence Sharing and Protection Act (CISPA), includes blocking and censorship powers. Rejected by Senate
2013: CISPA reintroduced
Source: Wikipedia

The same argument extends to national censorship: there are nations where accessing politically disruptive or religiously offensive material is illegal, and they censor it. As outside observers, we may disapprove of this, but we don't get to enforce our preferences on them, nor vice versa.

Of course, that argument gets messy when the regime in question is outright abusive. Internet censorship was used to stifle protesters during the Arab Spring, for example. The slippery slope is very much in evidence: today's high-minded filter is tomorrow's tool of repression.

Annoying, expensive, futile

Anyway, it won't work. All the filters in the world can only succeed to a limited degree. At best, banned material can only be driven underground, where the dedicated already know to look. Paedophiles have long operated closed networks to swap images, piracy has continued to grow despite the copyright industry's best efforts, and the darknet, with communities like Silk Road operating out of the encrypted Tor network, is almost impossible to shut down or police.

Proponents of the plans would argue that this is not the point: the goal is to prevent children from accidentally straying into offensive material. Cameron thinks the mere sight of run-of-the-mill pornography will pervert innocent young minds, but at the more realistic end, there is a case to be made for shielding children from the extreme (and illegal) depictions of gore, rape, and sexual abuse available online.

Delisting such pages from search engines is an effective way to reduce accidental visits, and probably the only meaningful move; search engines like Google can do this by removing objectionable material from their indexes. In rare cases, these companies can take a more active role in policing content which is almost universally illegal or repugnant, such as malware or child pornography.

But accidental visits are about all you can reduce - children are natural hackers and extremely adept at finding things you want to keep hidden. Teenagers were enjoying smuggled porn before the Internet, after all, and will continue to do so no matter how much we try to filter it. It doesn't take a lot of research to turn up the existence of VPNs, proxies, and Tor. John Gilmore, Internet activist and one of the founders of the Electronic Frontier Foundation, famously said: "The Internet interprets censorship as damage and routes around it," a statement no less true today than in 1993.

The nanny statesmen can implement all the filters they like, but the porn and the perverts will still be there.

Far more dangerous than any pornography is the reality of online abuse, running the gamut from bullying to paedophile grooming, none of which will be affected one iota by automated filters. The nanny statesmen can implement all the filters they like, but the porn and perverts will still be there. The final safety net has to be the watchful eye of the parent, not a mechanical filter.

Meanwhile, the filters will serve only to annoy Web users who don't want to be filtered but find themselves caught up in them - I've seen first-hand how medical editors despair when Web filters block access to research material about sexual health, for example. Many broad sites, such as discussion forums and chat rooms which may include adult material, will have to be blocked in toto. Legitimate material may or may not be affected - Cameron has admitted he doesn't know whether explicit literature like '50 Shades of Grey' should be blocked or not, for example. There are so many fringe cases - "can of worms" doesn't even begin to describe what this will entail.

Smoke and mirrors

But, most of the arguments for and against are moot, because except for a naïve but well-intentioned few, most politicians aren't really thinking of the children at all. Internet filters don't work - are known not to work - but that's not the point. Filters are a way of monitoring and controlling Internet users, and in most cases the true drivers are political control and corporate protection (usually of the copyright variety - in the first half of 2013, Google had processed an astonishing 100 million takedown requests, almost all aimed at pirate movies, music and software).

At best, banned material can only be driven underground, where the dedicated already know to look.

Under Cameron's UK plans, for example, the filter will be actively applied even to users who opt out - their browsing activities will still be recorded by the filter even if nothing is blocked. Although the government's stated goal is to implement default filtering for all Internet users, ISPs and users have resisted that in favour of opt-in filters (effectively little different from the filtering services many ISPs already offer). Some confusion now circulates over which side will claim victory: although the government has declared the system will be implemented by default, leaked correspondence with service providers suggests the government is playing an odd shell game, allowing ISPs to operate an opt-in system but encouraging them to describe it as the opposite ("default on") to their customers.

The Chinese connection

For the more paranoid observers, a deeper question also lingers about Huawei's involvement, and whether it is appropriate for the UK government to turn its national infrastructure, and the browsing habits of the entire nation, over to a foreign entity. Huawei has been accused of building back doors into its telecoms gear to allow the Chinese government access to telecoms networks around the world, along with more general criticism of its equipment's security. The firm has hotly denied these claims, and no concrete evidence has been presented. Of course, US tech firms denied opening their networks to US government agencies too, but even if Huawei is completely innocent of any state involvement or other vulnerability, the deeper ramifications of the UK's Internet censorship remain.

The filtering system of choice was pioneered by TalkTalk, a UK-based ISP and telecoms provider, which has been working on filtering technology since at least 2010 - in that year it faced customer outrage when its clandestine Web-tracking was first revealed, after users noticed visits from TalkTalk-controlled IP addresses immediately after they had visited sites. TalkTalk disclosed it had partnered with Huawei, capturing all customer URLs and comparing them to white- and black-lists in real time. MD Clive Dorsman defended the system, saying it didn't capture personally identifiable data, but as AOL found out the hard way in 2006, even anonymised data can reveal a great deal about users - which underlines the concern about a government-run filter that records even users who have opted out.
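To make that mechanism concrete, the sketch below (in Python, with invented names such as WHITELIST, BLACKLIST, check_url and filtering_enabled - none of them taken from TalkTalk's or Huawei's actual system) shows how a list-based filter of this kind typically behaves: every requested domain is looked up and logged, and the block decision is enforced only for customers who have filtering switched on. It is an illustration of the general technique, not a description of the real deployment.

# Illustrative sketch of a list-based ISP filter (hypothetical names throughout).
# Every URL a customer requests is checked and logged; the block decision is
# applied only if that customer has filtering enabled - which is why an
# "opt-out" still leaves a record of browsing activity.

from urllib.parse import urlparse
from datetime import datetime, timezone

# Hypothetical category lists; a real deployment would hold millions of entries.
WHITELIST = {"bbc.co.uk", "nhs.uk"}
BLACKLIST = {"example-adult-site.com", "example-gambling-site.com"}

log = []  # in a real system, a database under the operator's control

def check_url(customer_id: str, url: str, filtering_enabled: bool) -> bool:
    """Return True if the request should be allowed through."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")

    if domain in WHITELIST:
        verdict = "allow"
    elif domain in BLACKLIST:
        verdict = "block"
    else:
        verdict = "allow"  # unknown domains pass by default in this sketch

    # The lookup is recorded for every customer, opted in or not.
    log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "customer": customer_id,
        "domain": domain,
        "verdict": verdict,
        "filtering_enabled": filtering_enabled,
    })

    # Blocking is only enforced for customers with the filter switched on.
    return not (verdict == "block" and filtering_enabled)

# Example: the opted-out customer is never blocked, but both requests are logged.
print(check_url("cust-001", "http://example-adult-site.com/page", filtering_enabled=True))   # False (blocked)
print(check_url("cust-002", "http://example-adult-site.com/page", filtering_enabled=False))  # True (allowed)

Even in this toy version the privacy point is visible: the log grows for every customer regardless of their choice; opting out only changes whether the block is enforced.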

Looking ahead

It is clear there is steady pressure across the globe to filter the Internet. Western countries used to claim the moral high ground against the likes of Saudi Arabia and China, but we are joining them, one step at a time.

Some of the pressure is legitimate - law enforcement is battling to come to terms with the online world, content providers are losing a dreadfully expensive war against piracy, and there is plenty of truly despicable material in the darker corners of the Web.

Privacy and human rights activists emphasise the risks, but the reality is that most nations have implemented, or are trying to implement, some form of online filtering. It won't work, but it will change the Web, and we will change with it.
