By Ruth Coustick-Deal

Are mass content filtering rules legal? 

EU legal experts reviewed the proposal for mass content filtering to assess its legality, but its vague wording forces them to place too much trust in big business.

Are proposals for mass content filtering in the EU even legal? That's what several EU nations have just asked after reviewing the controversial draft copyright laws.

Belgium, the Czech Republic, Finland, Hungary, Ireland, and the Netherlands suggested that it might infringe on fundamental rights laws, on the right to run a business, and on existing laws against monitoring people online. The measure in question is the plan to introduce rules making a system like YouTube's Content ID mandatory wherever users can upload their own content.

This is a really serious issue that thousands of people have also been raising in the Save the Link campaign.

The questions from these states went to the European Council Legal Service, a body which examines them in detail, provides answers, and suggests a way forward.

We got their answers last week: they say that the law does not explicitly demand mass filtering and censorship filters. However, there are many 'ifs', 'maybes', and assumptions they are forced to make, because the rules that could lead to the creation of mandatory content filters are purposefully vague.

The advice given by the Council Legal Service certainly shows that they trust in the letter of the law. But the spirit in which it is implemented may turn out to be very different from their rosy view.

For example, the Legal Service's answers say that businesses (like Reddit, Tumblr, or GitHub) will get to choose which "technical measures" to install, and that because those measures must be "appropriate and proportionate", they assume they will be.

The advice also says that these restrictions on content sharing will only apply to content "identified by rightsholders", so we supposedly don't have to worry about lawful content being blocked.

However, this assumes that powerful rightsholders will not use this power to stop legal uses of their content that they don’t want seen.

The actual behaviour of the big rightsholder organisations gives us many examples of them censoring legitimate content. Real-life stories include copyright being used to censor oil prices, to censor reviews, and content filters literally censoring a lecture about copyright. This law will enable even more of these bad corporate practices.

The review eventually accepts that users making parodies or sharing content that uses copyright in a legal way (e.g. game reviews, online journalism) might be affected "in an unjustified manner". Yet it concludes that so long as there is a redress mechanism for content that should never have been blocked, it would be fair.

Getting content unblocked later can rob it of its relevancy and urgency. It is often lengthy, costly, and confusing to go through a redress system just to publish legal content. The kind of system rightsholders are talking about is one which would prevent certain users from publishing certain types of content at all, rather than having it taken down afterwards through a fair process (for example, Canada's notice-and-notice system).

They describe this as “adequate”.
We describe this as a censorship machine.

Simply put, when the law is vague, making assumptions about how corporations might act isn't enough. How it will be implemented, whether there will be mass surveillance involved, what consequences people could face: it's all unknown.
The best legal experts have spoken: "The confusing terms under which the recital is drafted raise various legitimate questions to which, regrettably, no clear answers are given."

When vague laws are passed, it lets those in power determine their meaning after the fact.

We hope that the law can be changed in a way that addresses its vagueness and puts defending our right to free expression at its heart.

