By Matt Hatfield

Explaining Bill C-63, The Online Harms Act: An OpenMedia FAQ

The good, the bad, and what must be fixed in Bill C-63.

In development since 2021, Canada’s Online Harms Act has now been introduced as Bill C-63 in the House of Commons - and it’s a hugely mixed bag. There are some huge wins for OpenMedia’s community in this version of the bill: many of the worst ideas Canada’s government had explored are gone, and some smart, targeted measures for protecting young people in Canada are front and centre.

But Bill C-63 is FAR from ready to go. Alongside the portion of C-63 that creates the Online Harms Act itself, our government has shoehorned in sweeping changes to Canada’s Criminal Code and Human Rights Act that could have a profound chilling effect on speech in Canada, frightening Canadians away from political protest and from sensitive but necessary social topics. To stop the chill, these portions must be drastically amended or removed before Bill C-63 can be passed into law.

Read on for our detailed breakdown of what Bill C-63 currently does, what we like in it, and what MUST be fixed before C-63 can be passed.

1. What does Bill C-63 actually do?

2. What’s good in Bill C-63?

3. What’s wrong with Bill C-63?

4. Why not just vote down Bill C-63?

5. Shouldn’t we do more to address online hate?

6. Can we really separate the good in Bill C-63 from the bad? 

7. So what can I do?


 

1. What does Bill C-63 actually do?

It creates three primary duties for social media services above a significant user threshold:

I. A duty to act responsibly: principally by developing, implementing, and reporting on the success of a risk plan for mitigating the impact and spread of 7 types of harmful content on their service:

  1. Child sexual abuse or revictimizing material;
  2. Content encouraging children to harm themselves;
  3. Content used to bully children;
  4. Nonconsensually shared adult content;
  5. Content that incites violence;
  6. Content that incites violent extremism, or terrorism;
  7. Content that foments hatred.

II. A narrow takedown duty to remove content their users report that appears to be child sexual or physical abuse content, or nonconsensually shared adult material, within 24 hours. Importantly, if either the uploading or reporting user disagrees with the platform’s decision, they can appeal it and are entitled to human review.

III. A duty to protect children: platforms that permit children and teens to register must design their service with their special needs and vulnerabilities in mind, and implement industry-standard and regulator-developed features to protect them.

Beyond these three duties, platforms must also adopt some smaller protective measures, including:

  1. Empowering their users to easily block other users and to speak with a human platform representative about harmful content or content takedowns, plus a mandated content reporting system with a right of appeal for both posters and reporters of potentially concerning content;
  2. Attempting to identify harmful content spread by bots, and flagging it for their users as bot-generated.

To implement these objectives, Bill C-63 creates a Digital Safety governance system in three parts: 

  1. A Digital Safety Commission of 3-5 people who will enforce the Act, including deciding whether the risk mitigation plans proposed by designated platforms are adequate; issuing takedown notices for child abuse or nonconsensually shared adult content; and developing reports to Parliament on the impact of the Act.
  2. A Digital Safety Ombudsman, who will review complaints about individual harms from users; support them with educational resources; and advocate for the public interest around online safety.
  3. A Digital Safety Office, whose staff will support both bodies.

Most problematically, C-63 updates the Criminal Code and Canadian Human Rights Act with harsh new penalties and an unusual ‘pre-crime’ status.

  1. A new hate offence is created that can be added to any other crime, with a maximum sentence of life imprisonment;
  2. The crime of advocating or promoting genocide has its maximum penalty increased from 5 years to life imprisonment;
  3. A right is provided for Canadians to bring online hate speech complaints to Canada's Human Rights Tribunal, with a maximum penalty of $70,000 for the speaker;
  4. A new and deeply problematic type of ‘pre-crime’ is created in section 810.012: the status of being assessed as ‘likely’ to commit an online hate offence.

With the approval of a judge and a provincial attorney general, this last power allows a ‘recognizance’ - a set of binding conditions an individual must follow or face criminal penalties - to be placed on a Canadian believed to fit this category. The individual need not have been previously convicted of any crime, hateful or otherwise, and the list of possible conditions is long, including wearing a tracking device, curfews, drug tests, alcohol abstinence, and bans from designated places, all for up to two years.

2. What’s good in Bill C-63?

We’ve called the Internet-focused parts of Bill C-63 a “night and day improvement” on the government’s 2021 proposals, and we mean that. The portions of C-63 that create the new Online Harms Act itself are largely carefully written, with narrow, specific targets that recognize how the Internet works and how platforms are likely to respond to the legislation. Let’s talk through what C-63 gets right:

The Online Harms Act portion of Bill C-63 chooses the right targets and asks them the right questions. Most of the web is scoped out; only large social media and streaming services are scoped in. These services are asked to develop and publish their own risk mitigation strategies, let all of us read their approach, and demonstrate to us how well that strategy is working. Platforms will no longer be able to report only facts that make them look good. Instead, they’ll have to account for any of the 7 illegal harms occurring on their platforms, describe how they’re addressing them, and learn from each other about what works and what does not.

What’s more, if C-63 passes, we won’t be relying on platform disclosures alone. Academic researchers can apply to work with the anonymized data platforms provide to the Digital Safety Commission, and study what’s going on on platforms for themselves. Right now, when any of us try to understand what’s happening on the Internet, we rely on data platforms selectively provide when it paints them in a positive light, and on scattershot information from leakers like Frances Haugen. Bill C-63 will help us secure a thorough factual understanding of how and why illegal harmful content spreads on online platforms - something we’ve never really had before.

The worst and most easily detected material rightly gets the toughest treatment. Child abuse material is some of the ugliest content on the entire Internet, and much of it is readily identifiable by AI detection. Nonconsensually shared adult content is only a bit more difficult to detect, once the person depicted provides a sample of the content they want removed. That makes the decision to demand automatic takedowns for these two content types, and only these two, an intelligent choice in Bill C-63 that we support.

Safety by design for children’s accounts makes sense. We cannot kid-proof the whole Internet so that children can experience every part of it, and we shouldn’t try. But asking platforms to adopt industry-standard child safety features for their child accounts makes a great deal of sense. Of course, young people have both expression and privacy rights; but they also face unique risks and challenges. They deserve a distinct regulatory approach that accounts for their needs.

A lot of bad ideas Canada had considered are left out. Bill C-63 learns from what other democracies have gotten right and wrong. Good learnings include:

  • Private messaging is excluded (Section 6(1)), and encrypted communications will not be threatened;

  • There is no mandatory takedown requirement for any speech content. Platforms are free to assess nuances of speech, and judge for themselves what content is appropriate to their user base;

  • There is no proactive surveillance of every Canadian to search for illegal content (Section 7(1)). This is a sharp contrast with the vast surveillance web we found in 2021’s proposal. Bill C-63 will rely on the same user report system most current platforms already provide to detect illegal content, with clarified obligations for how they should support their users with tools, human oversight, and a takedown appeal system.

3. What’s wrong with Bill C-63?

Frustratingly, the government has jammed together the Online Harms Act portion of Bill C-63 and huge changes to Canada’s Criminal Code and Human Rights Act that risk deeply chilling Canadians’ lawful online speech. 

These include maximum sentences of life imprisonment for advocating or promoting genocide, and for a new category of ‘hate offence’ which can be added to any other crime. Under changes to the Human Rights Act, individual Canadians will be able to bring complaints about the online speech of other individuals in Canada and seek penalties of up to $70,000 against the person posting it, with up to $20,000 awarded directly to them as a possible victim of that speech. This tribunal process is not obligated to follow the normal rules of evidence that apply in court, or to evaluate whether the offending speech was true.

No less an authority than Beverley McLachlin, former Chief Justice of the Supreme Court of Canada, has commented that these proposals will face severe constitutional challenges should they become law. But you don’t need to be a constitutional scholar to see that these proposals go too far, fail to meet basic proportionality and administration of justice tests, and risk being misused to harass and silence lawful speech.

That’s unfortunately not all. Changes to section 810.012 of the Criminal Code create a new type of ‘pre-crime’: bond conditions that a judge and attorney general together could impose on someone in Canada they believe is at high risk of voicing online hate, even if they’ve never committed a crime, hateful or otherwise.

The government has justified this new ‘pre-crime’ as similar to peace bond conditions that already exist for those at imminent risk of committing terrorism or domestic violence. This is a very poor argument. Existing peace bonds are imposed when there is a plain and imminent risk of violence to a specific person or people, not for speech that could cause theoretical, unproven harm to someone. There’s simply no direct comparison to be made here.

Civil liberties advocates, academics, and even Canada’s largest newspaper are calling these proposals out as poorly designed, disproportionate, and not belonging in Bill C-63. OpenMedia is in full agreement; these portions of Bill C-63 must go. 

4. Why not just vote down Bill C-63?

Because we don’t have to lose C-63’s good ideas to remove its bad ones. It would be a mistake to throw out all the nuanced, balanced portions of Bill C-63 that would improve the worst parts of the Internet because of a deeply broken but non-essential part of the Bill.

If Bill C-63 leaves Parliamentary committee with no major amendments to fix its speech-chilling provisions, OpenMedia will regretfully advise MPs to vote AGAINST Bill C-63. But there is time and space between now and then for MPs to fix the Bill and give us legislation almost all Canadians can get behind.

As compared to competing bills like Bill S-210, much of Bill C-63 is a package of smart, nuanced protections for young people that could materially improve their online lives without endangering anyone’s privacy or freedom of expression. But that will ONLY be true if the extreme changes it proposes to the rules governing our online speech are fixed.

5. Shouldn’t we do more to address online hate?

You can think that and still be concerned about Bill C-63’s approach. Most people are unhappy with some posts they see on the Internet. There’s a great deal of what’s called “lawful but awful” speech out there - plainly offensive and disrespectful expression, and “dog whistles” that can be interpreted as signalling unexpressed hateful thoughts.

Reasonable people will disagree about whether a given piece of speech is merely offensive and awful, poses a threat or harm to people, or provocatively advances a crucial social debate. That’s more obvious in 2024’s social environment than ever. Some European democracies have somewhat tougher hate speech laws than Canada, and our southern neighbours have no hate speech laws at all; both approaches seem to be compatible with a flourishing democracy with widespread expression and participation.

But no other Western democracy has created “pre-crime” limitations on people for the risk of offensive speech, as opposed to direct violence. No smart democracy floods its human rights adjudication system with complainants who not only bear no personal cost for frivolous or malicious complaints, but actually have a direct financial incentive to test their luck. And a careful legislature does not create extraordinary new life imprisonment offences with no clear picture of why it is doing so - of which existing crimes it believes have been under-penalized and must be handled differently.

Even if you believe Canada’s current approach to illegal hate speech could use tightening, we believe you should see that C-63’s approach to speech is extreme and inappropriate as currently written. If any changes to Canada’s hate laws are needed, they should be as carefully devised and tested for problems as the rest of C-63, not smuggled in undercooked alongside more mature measures.

6. Can we really separate the good in Bill C-63 from the bad? 

Yes. The Criminal Code and Human Rights Act changes in Bill C-63 are not integral, necessary parts of this legislation. They’re not even part of the text of the new Online Harms Act that Bill C-63 creates!

Even if we separate these sections, there’s going to be work to do before C-63 is entirely ready to go. Some further work will be needed to better define and oversee the actions of the Digital Safety Commission, to ensure the new body’s powers are used appropriately. 

But this is cleanup work of the kind a Parliamentary committee typically performs during the amendment process, and a much less contentious and risky undertaking. It won’t carry the obvious risk of the extreme and draconian punishments the government has packaged into Bill C-63 being wrongfully applied and hurting Canadians’ fundamental rights.

If our MPs do the right thing and split Bill C-63, expect to hear more from us and our community at committee about how else to tighten Bill C-63 up.

7. So what can I do?

Tell your MP about your concerns with Bill C-63. Use OpenMedia’s C-63 action to share your thoughts in just a few seconds! We’ve started you off with some draft text, but please feel free to edit it as you choose and add your (respectful) personal take.

You might think no one is listening. But we meet with MPs all the time on your behalf, and we can tell you that expressing yourself directly to your MP makes a HUGE difference.

Every MP needs to know there’s a win-win path forward with Bill C-63. Canadian children shouldn’t have to keep waiting for crucial protections from bullying and abuse while our government debates Bill C-63’s extreme speech punishment provisions. Split Bill C-63, keep the good, and dump the bad!


If you haven’t taken action yet, click here!


