Rey Qu

Canada’s Age-Verification Bill Repeats the UK’s Mistakes: Here’s How to Do Better

Canada is exploring age-verification requirements under Bill S-209, but adopting such systems without strong safeguards could repeat mistakes seen overseas. The UK’s failed experiment offers a warning; Canada still has time to choose a better path.

Across the world, governments and parents are looking for ways to make the internet a safer place for children. Tragic stories of teens harmed by online content, from bullying to suicide after interacting with unsafe chatbots, have heightened concerns about youth safety online. Countries like the UK and Australia have rolled out or proposed age-verification systems as part of this global push for “online safety.” In the UK, groups like Internet Matters help families set privacy controls and talk about digital consent, while in Australia, the eSafety Commissioner offers guides and webinars to help parents protect kids online.

In Canada, the Canadian Centre for Child Protection (C3P) has called for adding age-assurance requirements to online-harms legislation, arguing that existing safeguards fall short of protecting minors. Meanwhile, Parliament has been considering Bill S-209 and its earlier version, Bill S-210. While S-209 is “new” in Parliament, it still largely mirrors core provisions of Bill S-210, which was introduced in 2022 but stalled before passing. The updated Bill S-209 creates an offence for any organization that, for commercial purposes, makes pornographic material available on the Internet to persons under 18.

Supporters argue that this will protect minors from harmful online content. However, critics warn that it could still apply to general-purpose platforms, not only pornography-focused services. While Bill S-209 uses narrower language and responds to some of the criticisms of Bill S-210, the bill’s sponsor told the committee that the government is free to apply the age-verification requirements, and presumably site-blocking orders, to any site, not just pornography sites. On top of that, the bill allows court-ordered blocking of websites deemed non-compliant, which may create serious risks for lawful expression and access to information. While the bill mentions privacy, it leaves big gaps around how personal or biometric data would be collected, stored, or protected. To understand how these laws could backfire, we should look abroad.

Canada is not the first country to take on this issue. The United Kingdom's recent experience provides a valuable lesson. When its Online Safety Act introduced strict age-verification rules, automated filters ended up blocking LGBTQ+ forums and sexual-health education. Canada now faces a choice: repeat these mistakes, or learn from them and chart a better path.

The UK's Experiment with Age Checks

The UK offers an early case study of how well-intentioned online-safety laws can go wrong. The UK first attempted nationwide age verification in 2019 under the Digital Economy Act. It required adult websites to verify users’ ages through third-party companies, with the British Board of Film Classification as the regulator. However, after repeated delays and privacy backlash, the government officially dropped the rollout in October 2019.

The idea returned a few years later under the Online Safety Act 2023, which expanded the scope far beyond adult websites.

This law imposed massive fines for non-compliance: up to 10 percent of global revenue or £18 million, whichever is greater. To avoid penalties, many platforms have overreacted, deploying automated filters that cannot understand context. The result has been widespread over-blocking that reaches far beyond adult sites. Legitimate Reddit communities for LGBTQ+ support, health information pages, and even educational resources were caught in the net, all because of certain keywords. Safe online spaces where people found community and help disappeared overnight.

The policy also raises serious privacy concerns. To prove their age, users are asked to upload government IDs, such as driver’s licences, or even submit live face scans. These checks aren’t done by the websites themselves but by outside “age verification” companies, which means highly sensitive personal data is being shared with third parties. Once that information is out, controlling its security becomes incredibly difficult.

The policy also threatens the security of private messaging. Consider apps like Signal or WhatsApp: to meet child-protection requirements, the Act empowers the regulator to require companies to scan private messages for harmful content, something experts see as a step toward weakening end-to-end encryption. In other words, messages that were supposed to be just between two people could be monitored, and people fear they can no longer communicate securely and freely.

Despite these intrusive measures, the system still isn’t proving very effective. Tech-savvy kids simply use VPNs to bypass the blocks, and the verification technologies adopted often prove unreliable and easy to fool. Some smaller websites, facing costly compliance requirements and massive fines, have said they will have to shut down rather than risk penalties.

Finally, the system is overloaded. Platforms report millions of extra verification requests every day, over five million daily according to the Age Verification Providers Association, leading to slow logins and even website crashes.

The UK’s experience shows that technical fixes cannot solve complex social problems by themselves. When privacy safeguards are weak and rules are written too broadly, efforts to protect children can end up erasing the educational, artistic, and community spaces that help people stay informed and safe. As Canada debates Bill S-209, it faces the same dilemma: how to safeguard young people online without turning the internet into a system of surveillance and censorship.

Potential Issues in Canada’s Approach

Canada’s Bill S-209 and Bill S-210 share many of the same goals as the UK’s Online Safety Act: keeping minors safe online. But experience abroad shows that good intentions can have unintended effects. Several parts of these proposals deserve a closer look.

Over-blocking is a concern here too, because the power to order website blocks could be misused. Bill S-209 creates an offence committed by any organization that makes pornographic material available to individuals under 18 for commercial purposes. While the bill attempts to limit its scope by excluding search engines and other services that only incidentally provide access to such content, the government would still be free to apply age-verification requirements, and potentially site-blocking orders, to a broader range of services, not only pornography sites. This leaves open the possibility that the bill’s reach could expand well beyond what the text currently suggests.

Over-broad blocking also shuts people out and reinforces digital inequality. For many people, especially those in rural or marginalized communities, the internet is a vital place to learn about sexual health, identity, and mental health. When age-verification systems over-block content, these groups risk losing access to legitimate sexual-health information and resources. And people who lack a government ID or their own device to complete strict ID checks risk being unfairly cut off from essential services, communities, and the information they rely on.

Meanwhile, creators, especially smaller ones, face their own risk of being shut out. With fines of $250,000 for a first offence, the stakes could crush them, while large tech companies can easily absorb the cost. In the UK, similar high costs led some small and volunteer-run services to pre-emptively shut down rather than face legal risk they could not possibly absorb.

Privacy safeguards are unclear. To verify their age, Canadians may need to upload identification or personal data to a verification provider chosen by the site they visit, not by them. Bill S-209 lacks strong rules on what happens to that data once it enters those vendors’ hands. Who keeps the ID scans, and for how long? What happens if the data leaks or is sold? Without clear regulations, the risk of major data breaches is significant.

Taken together, these gaps risk building an internet that is both harder to access and more intrusive than intended. Unless privacy and access safeguards are built in from the start, Canada could easily repeat the same mistakes seen abroad.

How Other Countries Balance Safety and Privacy

Other countries are showing that protecting children online does not have to come at the expense of privacy. Their experiences highlight practical ways to keep minors safe while respecting digital rights.

France: France requires websites displaying adult content to implement age-verification systems that meet a high privacy standard, using a “double-blind” verification model: adult websites never see the user’s identity, and age-verification providers cannot see which site is being accessed.
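
The “double-blind” idea can be made concrete with a short sketch. This is a minimal illustration of the token-based flow, not any real provider’s protocol: it assumes a single trusted provider and uses an HMAC as a stand-in for the public-key signature a real deployment would use, and all names here are hypothetical.

```python
# Sketch of a "double-blind" age-verification token flow (illustrative only).
import hmac
import hashlib
import json
import secrets
import time

# Secret held only by the age-verification provider (stand-in for a signing key).
PROVIDER_KEY = secrets.token_bytes(32)

def issue_age_token(user_is_over_18: bool) -> dict:
    """The provider checks the user's age (e.g. against an ID) and returns a
    signed, short-lived token carrying only an over/under-18 claim -- no name,
    no ID number, and no record of which site will consume it."""
    claim = {
        "over_18": user_is_over_18,
        "nonce": secrets.token_hex(16),          # fresh per token, prevents linking/replay
        "expires": int(time.time()) + 300,       # valid for five minutes
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def provider_verify(payload: bytes, sig: str) -> bool:
    """The provider exposes only signature verification, never user identity."""
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

def site_accepts(token: dict) -> bool:
    """The adult site checks the token via the provider's verification
    interface; all it ever learns is 'over 18: yes or no'."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    if not provider_verify(payload, token["sig"]):
        return False                             # forged or tampered token
    if token["claim"]["expires"] < time.time():
        return False                             # expired token
    return token["claim"]["over_18"]

token = issue_age_token(True)
print(site_accepts(token))  # True
```

The key property is the separation of knowledge: the provider learns the user’s age but not the destination site, while the site learns only a yes/no answer, so no single party holds both identity and browsing history.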

Australia: Australia has opted to “test, don’t guess”. Its eSafety Commissioner launched a national trial to study the options before enforcing any single one. While social media platforms must block users under 16 by December 2025, they are not required to check every user’s ID; they are instead encouraged to use “softer measures” such as parental controls, reporting tools, or AI detection. This test-before-enforcement approach shows how governments can protect young users without over-policing the internet.

These examples show that strong privacy protections and child safety can coexist. France focused on preserving anonymity, and Australia on gradual implementation that can be responsive to emerging evidence, each offering lessons for Canada as it considers its own path forward.

A Path Forward for Canada’s Online Safety Rules

Canada has an opportunity to protect children online while defending privacy and free expression. To make that possible, lawmakers should start with strong privacy safeguards and fair, transparent enforcement. 

  • Privacy Rules First. Before rolling out any new system, lawmakers should establish strong, clear rules for how our data is handled. Companies must explain what data they collect, how long they keep it, and who can access it. The framework should include independent audits and clear deletion rules. Newer age-assurance approaches, such as “double-blind” or token-based systems that confirm a user’s age without exposing who they are, are also worth exploring; they show that age checks don’t have to come at the cost of building giant databases of ID scans. If Canada adopts age-assurance technology, it should prioritize privacy-protecting solutions that minimize unnecessary data collection.
  • Make It Fair. The rules shouldn’t be one-size-fits-all. Regulations must be proportionate to risk and recognize the difference between large and small platforms. Fines and duties should match the size and capacity of an organization: small Canadian creators and community sites shouldn’t face the same massive fines as a tech giant, or be driven offline.
  • Blocking Is a Last Resort. Website blocking is an extreme measure; it should be rare and never applied to legal content. Even the risk of being blocked creates a “chilling effect”: platforms might delete any content that feels “risky”, like health education, art, or LGBTQ+ support groups, just to be safe.
  • Test, Don’t Guess. Canada should pilot and review new measures before making them law, to ensure they truly work and protect privacy in practice. Testing different ideas lets us compare options, including “softer” tools that might work even better, such as much-improved parental controls, AI that spots risky behaviour, or better reporting tools, rather than forcing a high-risk ID check on everyone.

Get Involved: What You Can Do as an Individual

If we want a system that addresses the worst online harms without sacrificing our freedom of expression or privacy, we cannot stay silent. It’s time to get online safety right: ask our new government to pass sensible, rights-respecting online-harms legislation.

Decisions made now will shape Canada’s digital landscape. Every message to Parliament reminds decision-makers that Canadians care about both safety and rights and that privacy should never be treated as optional.

Add your voice to demand online harms legislation that respects your rights and protects Canadians.
