
Canada must regulate AI right, not first

OpenMedia's testimony to Canada's Parliament on how to regulate our privacy and artificial intelligence effectively.

On November 2, 2023, we testified at the House Committee on Industry and Technology on the impact and importance of Bill C-27, Canada's pending private sector privacy and artificial intelligence regulation bill. Our message: close the loopholes in the privacy regulation and get those protections passed, but give the AI regulation a second look. It's better to take the time to get AI rules right than to be the first to roll them out; there are no shortcuts here, and doing the hard work now avoids problems later. We call for sending AIDA, the Artificial Intelligence and Data Act, back for full public consultation, and for ensuring that regulating AI isn't made the exclusive job of the innovation ministry that is also subsidizing AI development. Our full opening remarks are attached below.

House Committee on Industry and Technology

Re: Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Good afternoon. I’m Matt Hatfield, and I’m Executive Director of OpenMedia, a grassroots community of nearly 300,000 people in Canada who work together for an open, accessible, and surveillance-free Internet. I am speaking to you today from the unceded territory of the Tsawout, Saanich, Cowichan, and Chemainus Nations.

What to say about Bill C-27? One part is long overdue privacy reform, and your task is to close its remaining loopholes and get the job of protecting our data done. One part is frankly undercooked AI regulation that you should take out of C-27 altogether and take your time to get right. I can’t address both at the length they deserve, and I shouldn’t have to.

But we are where the government has forced us to be, so let’s talk privacy. There are some great changes in C-27. These include real penalty powers for the Office of the Privacy Commissioner (OPC), and the Minister’s promised amendment to entrench privacy as a human right. OpenMedia hopes this change to PIPEDA will clearly signal to courts that our ownership of our personal data is more important than a corporation’s interest in profiting off that data. But any regulatory regime is only as strong as its weakest link. It does Canada no good to promise the toughest penalties in the world if they’re easy to evade in most real-world cases. And the weaknesses in C-27 will absolutely be found and exploited by companies that wish to do Canadians harm. That’s why it’s critical that you remove the consent exceptions in C-27 and give Canadians the right to ongoing, informed, withdrawable consent for all use of our data.

While you’re fixing consent, you must also broaden C-27’s data rules to apply to every non-governmental body. This includes political parties; nonprofit organizations like OpenMedia; and vendors that sell data tools to any government body. No other advanced democracy tolerates a special exemption from privacy rules for the very parties who write privacy law; that’s an embarrassing Canadian original, and it shouldn’t survive your scrutiny of this bill.

Privacy was the happier side of my comments on C-27. But let’s talk AI. I promise you: our community understands the urgency of putting some rules in place for AI. Earlier this year, OpenMedia asked our community what they hoped for and worried about with generative AI. Thousands of people weighed in and told us they believe this is a huge moment for society: almost 80% think this is bigger than the smartphone, and 1 in 3 of us think it will be as big as or bigger than the Internet itself. Bigger than the Internet is the kind of thing you are going to want to get right!

But being first to regulate is a very different thing from regulating right. Minister Champagne is at the UK’s AI safety conference this week, telling media the risk is “doing too little, not too much.” Yet at the same conference, Rishi Sunak used his time to warn that we need to understand the impact of AI systems far better than we currently do in order to regulate them effectively, and that no regulation will succeed if the countries hosting AI development do not develop their standards in close parallel. That’s why the participants at that conference are working through foundational questions about exactly what is at stake and in scope right now. It’s an important, necessary project, and I wish them all success with it.

But if they’re doing that work there, why are we here? Why has this committee been tasked with jamming AIDA through within a critical but unrelated bill? Why is Canada so confident that we know more than our peers about how to regulate AI, so confident that we’re skipping the basic public consultation that even moderately important legislation normally receives?

I have to ask: is AIDA about protecting Canadians, or is it about creating a permissive environment for shady AI development? If we legislate AI first, without learning in tandem with larger and more cautious jurisdictions, we’re not going to wind up with the best protections. Instead, we’re positioning Canada as a kind of AI dumping ground, where businesses whose practices are not permitted in the US and EU can produce rights-violating, even dangerous models. I am worried that this is not a bug but the point: that our innovation ministry is fast-tracking this legislation precisely to guarantee Canada will have lower AI safety standards than our peers.

If generative AI is a hype cycle whose products will mostly underwhelm, then this is much ado about not much, and there is no need to rush this legislation. But if it is even a fraction as powerful as its proponents claim, failing to work with experts and our global peers on best-in-class AI legislation is a tremendous mistake.

I urge you to separate AIDA from C-27, and send it back for a full public consultation. But if that isn’t in your power, at the very least, you cannot allow Canada to become an AI dumping ground. That’s why I urge you to make the AI commissioner report directly to you, our Parliament, not to ISED, a ministry whose mandate to sponsor AI will give it a strong temptation to look the other way on shady practices. That commissioner should be charged with reporting to you yearly on the performance of AIDA and the gaps that have been revealed in it. And I urge you to mandate a parliamentary review of AIDA within two years of C-27 taking effect, to decide whether it must be amended or replaced.

Since PIPEDA reform was first proposed in 2021, OpenMedia’s community has sent more than 24,000 messages to our MPs demanding urgent and comprehensive privacy protections. In the last few months, we’ve sent another 4,000 messages asking our Parliament to take the due time to get AIDA right. I hope you will hear us on both points. Thank you and I look forward to your questions.


