Victoria Henry

Facial recognition: Four things you need to know

Facial recognition: most of us have heard the term, and maybe even experienced it when our faces are recognised and tagged on social media, or used to unlock our phones. But do you know what the concerns are, or whether it’s being used in public spaces in Canada? Here are four things you should know.

First things first: what is facial recognition? Quite simply, facial recognition technology takes an image of your face, for example from a photo or from surveillance video footage, and uses algorithms to match it against faces in a database to establish your identity.
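Under the hood, most modern systems reduce each face image to a numeric "embedding" vector, then compare vectors for similarity against a database of known faces. The sketch below is a rough illustration only: the toy three-number embeddings, the names, and the 0.9 threshold are all hypothetical (real systems derive vectors with a hundred or more dimensions from neural networks).

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.9):
    """Return the name of the closest database entry above the threshold, or None.

    probe: embedding vector for the face being checked
    database: dict of name -> enrolled embedding vector (hypothetical)
    """
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy "enrolment" database — purely illustrative numbers.
db = {
    "alice": [0.9, 0.1, 0.2],
    "bob": [0.1, 0.9, 0.3],
}
probe = [0.88, 0.12, 0.22]  # a face scan that closely resembles "alice"
print(identify(probe, db))  # → alice (similarity ≈ 0.999 clears the 0.9 threshold)
```

Note that the threshold is a tuning choice: set it lower and the system flags more people, including more innocent ones; set it higher and it misses more genuine matches. This trade-off is at the heart of the accuracy problems described below.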

1. Facial recognition is unregulated in Canada - but it’s already happening here

Several police services, including Calgary and Toronto, have already been using facial recognition programs, which they claim are only being used to match crime scene footage to mugshot databases.

Several other police services, including Montreal and Halifax, refuse to confirm or deny whether they are using facial recognition. The lack of laws dealing with this technology means that there is very little anyone can do to verify these claims, or to get a straight answer as to whether we are being targeted by facial recognition.

But it’s not just police who are using facial recognition in Canada. On the commercial side, the Office of the Privacy Commissioner is currently investigating the use of facial recognition technology at top shopping malls in Canada operated by Cadillac Fairview, which claims the technology is used to monitor the age and gender of shoppers.

These systems have crept in without any kind of regulation, which means there’s no oversight, no accountability, and no transparency. The Personal Information Protection and Electronic Documents Act (PIPEDA) regulates the use of personal information by private enterprises. The Privacy Act does the same for government. But neither of these Acts specifically mentions biometric data such as facial recognition. That’s because both of these Acts are woefully out of date, and need to be updated to address modern technologies.

2. Facial recognition is inaccurate and biased

In test after test, facial recognition systems have been found to be worryingly inaccurate, and biased in their ability to recognise women and people of colour.

Amazon's Rekognition facial recognition software, which police departments throughout the U.S. use, has a higher error rate when identifying the gender of women and darker-skinned people.

In a test run by the ACLU, Rekognition software incorrectly matched 28 members of Congress to mugshots, identifying them as other people who have been arrested for a crime. The false matches disproportionately affected people of colour. 

More than 2,000 people in Cardiff, UK were wrongly identified as potential criminals during the 2017 Champions League final, and the system used at the Notting Hill Carnival in London, UK, was found to be wrong 98% of the time.
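False matches at this scale follow directly from base rates: even a small false-positive rate, applied to a huge crowd of mostly innocent people, produces thousands of wrong flags that swamp the handful of genuine matches. A back-of-envelope sketch, using entirely hypothetical numbers rather than figures from any real deployment:

```python
# Why even an "accurate" system floods police with false matches.
# All numbers below are hypothetical, for illustration only.
crowd_size = 170_000         # faces scanned at a large event (assumed)
false_positive_rate = 0.01   # 1% of innocent faces wrongly flagged (assumed)
true_matches = 50            # watchlisted people actually present (assumed)

false_alarms = int(crowd_size * false_positive_rate)  # innocent people flagged
total_flags = false_alarms + true_matches
precision = true_matches / total_flags  # fraction of flags that are correct

print(f"{false_alarms} false alarms; only {precision:.0%} of all flags are real matches")
# → 1700 false alarms; only 3% of all flags are real matches
```

In other words, a system that is "99% accurate" per face can still be wrong for the overwhelming majority of the people it flags, because almost everyone scanned was never on a watchlist to begin with.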

Beyond the technology itself, there is also extensive evidence of discrimination in how facial recognition is deployed. It has been rolled out overwhelmingly in low-income neighbourhoods and communities of colour: low-income housing requiring residents to be scanned to enter their own homes, deployments in communities around Detroit, and festivals drawing predominantly people of colour, such as London’s Notting Hill Carnival.

3. It’s almost impossible to stop your face from being used in facial recognition databases

Half of adults in the United States are now in facial recognition databases, and Canada won’t be far behind if we don’t slam the brakes on the spread of this technology.  

Many facial recognition systems are built on images from driver’s licences and passport photos. Here in Canada, the Insurance Corporation of British Columbia (ICBC) attempted to provide the RCMP with its facial recognition database, made up of driver’s licence images, but it was (at least for now) stopped from doing so by the Office of the Privacy Commissioner.

Publicly available photos from social media have also been used to train facial recognition systems. IBM has come under fire for scraping nearly a million photos from Flickr without notifying or seeking consent from any of the individuals who uploaded them.

Some facial recognition systems used by law enforcement rely on mugshot databases - but this also has major issues. Many mugshot databases include images of people who have been arrested but never found guilty. And because people of colour are disproportionately targeted by police, their images are more likely to be included in these databases.

4. Facial recognition is a slippery slope

The slippery slope of facial recognition leads to real-time, nonstop surveillance, where daily activities such as shopping, travelling around your community, attending a protest, or going to a political event are monitored, recorded and stored. There is no way to know how this information could be used in the future, or who it could be shared with.

In other countries around the world, limited use of facial recognition by law enforcement has typically led to greater and much broader rollouts of the technology.

In the U.S., President Trump has issued an executive order requiring facial recognition identification for 100% of international travellers in the top 20 U.S. airports by 2021.

In the UK, facial recognition is already being used at sports matches, street festivals, protests, and even on the streets to constantly monitor passers-by. 

Some of the world’s most extreme examples come from the Xinjiang region of China, where the government surveils millions of ethnic Uyghur people with facial recognition technology to control access to all areas of public life, including parks, public transportation, malls, and city boundaries.

Even facial recognition technology itself is a slippery slope. Tracking technology does not end at simply mapping faces. Even more intrusive and personal technologies are on the horizon, such as automated facial emotion detection, skin texture recognition and even vein mapping.

It’s time we asked ourselves if this is the future we really want. We have precious little time before this technology becomes a central part of surveillance in Canada. The fact that it’s largely unregulated in this country is a huge threat - but also an opportunity to get it right. 

We need to press the pause button, now. A full moratorium on the use of facial recognition technology by law enforcement is an absolute must, while issues of bias and inaccuracy are investigated. And our government must urgently reform the out-of-date Privacy Act and PIPEDA, using a full and transparent public consultation, to make sure that technologies like this can no longer be used to experiment on people in Canada without any protections for our privacy rights. 

Sign the petition now!