By Ruth Coustick-Deal

Content filtering: illegal, unpopular, and broken.

Decision makers are looking to content filtering and algorithms as the solution to every social problem manifested on the web. But they are not the easy fix they are presented to be.

The EU is winding down for the winter break, but the fight against censorship hasn’t taken a pause.

The European Commission (the EU body that drafts legislation) is pushing for default filters to block and hide all the problems of the Internet.

In the name of defending copyright, the Commission has proposed mandatory filtering to block content before it is even posted. This is what is put forward in Article 13 of the Copyright in the Digital Single Market Directive.

Although the Commission has seen the backlash and serious criticism from the academic community, that doesn’t seem to have shaken its belief in this system. It has recently been proposing filtering as the answer not just to copyright infringement, but to terrorism, “fake news”, bullying, and hate speech.

The Commission’s views are put across in a document, “Communication on Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms”:

“Online platforms should, in light of their central role and capabilities and their associated responsibilities, adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive.”

However, the existing laws against filtering are there for many reasons – we’ve produced a beautiful FAQ guide with all the details about what’s wrong with these filters. Essentially: they cannot recognise legal uses of content or understand context, such as footage of terrorists within a news segment; they over-block; and they put huge power in the hands of already powerful corporations. Filtering is also expensive and overly broad – hitting even non-profits like Wikimedia that post factual information. The underlying belief is that wherever users create, they must be controlled.

Furthermore, European Union law prohibits mass ‘monitoring’ – which any kind of content filtering requires, because the companies would have to monitor everything people post.

The Commission thinks it can get around this restriction by suggesting that platforms do all this filtering “voluntarily”. Our friends at the Electronic Frontier Foundation have a name for this kind of thing: Shadow Regulation.

Copyright expert and journalist Glyn Moody recently wrote an article explaining how making these filters “voluntary” doesn’t guarantee legality either. This is because the new data protection law, the GDPR, says that companies cannot use filters and algorithms to make automated decisions about us – such as automatically blocking posts from being uploaded.

He concludes:

“Article 13’s automated general upload filters are either voluntary, in which case they are illegal under the GDPR, or they are mandatory, and therefore illegal under the E-commerce Directive. There’s no other possibility. What’s clear is that upload filters are illegal in all situations, and must therefore be dropped from the Copyright Directive completely.”


The voices against censorship machines are multiplying

Led by Marietje Schaake, 30 MEPs responded to the Commission’s document, slamming this push to put robots in charge of content. In an open letter signed on December 5th, they said:

“The EU’s policies in this area resonate worldwide. A clear focus on preserving fundamental rights, enhancing transparency, and limiting the privatization of content removal decisions, is essential to create a legal framework that does not lend itself to abuse within the EU, and which does not provide an excuse for countries outside the EU to legitimize or vindicate questionable laws that suppress freedom of speech.”

Recently OpenMedia joined 80+ organisations in sending perhaps the world’s shortest open letter to the European Council, regarding the use of filtering mechanisms for copyright. It simply says this:

“We write to you to share our respectful but serious concerns that discussions in the Council and European Commission on the Copyright Directive are on the verge of causing irreparable damage to our fundamental rights and freedoms, our economy and competitiveness, our education and research, our innovation and competition, our creativity and our culture.”

We didn’t forget about the P.S. note though! It points to 29 previous statements from experts – representing hundreds of educational institutions, libraries, businesses and human rights organisations – who have already spoken up about the serious negative impacts that this proposal for censorship machines will have.

We have to be very wary, because these proposals for automated content filtering are not confined to the European copyright proposal: similar measures have recently been put forward in the United States and the UK.

The real harms the web has brought us, such as harassment and the spread of hate speech, are not going away. But to pretend that mass content monitoring and filtering will solve these known harms is to wilfully neglect the facts.

We will continue to work with experts and civil society organisations to ensure that governments do not take the easy route and outsource speech laws and policing to bots everywhere.

