It’s surprising how quickly public opinion can change. Winding the clocks back 12 months, many of us may have looked at a masked individual in public with suspicion. Now, some countries have enshrined face mask use in law. One consequence of this is that facial recognition systems in place for security and crime prevention may no longer be able to fulfill their purpose.
In Australia, for instance, most agencies are silent about the use of facial recognition. But documents leaked earlier this year revealed Australian Federal Police and state police in Queensland, Victoria and South Australia all use Clearview AI, a commercial facial recognition platform. New South Wales police also admitted using a biometrics tool called PhotoTrac.
What is facial recognition?
Facial recognition involves using computing to identify human faces in images or videos, and then measuring specific facial characteristics. This can include the distance between eyes, and the relative positions of the nose, chin and mouth. This information is combined to create a facial signature, or profile. When used for individual recognition – such as to unlock your phone – an image from the camera is compared to a recorded profile. This process of facial “verification” is relatively simple.
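The verification step described above can be sketched as comparing a stored facial signature with a freshly measured one. This is a minimal illustration only: the measurements, threshold and distance metric below are assumptions for the sake of example, not values from any real system.

```python
import math

def signature_distance(profile, candidate):
    """Euclidean distance between two facial signatures.

    Each signature is a list of measurements (e.g. distance between
    the eyes, nose-to-chin distance) -- hypothetical values here.
    """
    return math.sqrt(sum((p - c) ** 2 for p, c in zip(profile, candidate)))

def verify(profile, candidate, threshold=0.5):
    """1:1 verification, as when unlocking a phone: accept the
    candidate if its signature is close enough to the enrolled one."""
    return signature_distance(profile, candidate) <= threshold

# Hypothetical enrolled profile and a new camera capture
enrolled = [0.62, 0.31, 0.45, 0.28]
capture = [0.60, 0.33, 0.44, 0.29]
print(verify(enrolled, capture))  # small distance, so a likely match
```

Real systems use far richer feature vectors, but the principle is the same: verification is a single comparison against one known profile, which is why it is the easy case.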
However, when facial recognition is used to identify faces in a crowd, it requires a significant database of profiles against which to compare the target image. These profiles can be collected legally by enrolling large numbers of users into systems. But they are sometimes collected through less direct means, such as via social media. (Clearview AI, for instance, “uses a process called data scraping to scour the internet for what it states are ‘public’ photos to stockpile its database. This includes gathering images from Facebook, Instagram, Twitter, YouTube, LinkedIn, Venmo, and other social media applications.”)
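Identifying a face in a crowd, by contrast, is a one-to-many search over all stored profiles. It can be sketched as a nearest-neighbour lookup; the gallery names, signature values and threshold below are illustrative assumptions.

```python
import math

def signature_distance(a, b):
    """Euclidean distance between two facial-signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(gallery, candidate, threshold=0.5):
    """1:N identification: return the enrolled name whose profile is
    closest to the candidate signature, or None if no stored profile
    is close enough to count as a match."""
    best_name, best_dist = None, float("inf")
    for name, profile in gallery.items():
        dist = signature_distance(profile, candidate)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Hypothetical gallery of enrolled facial signatures
gallery = {
    "person_a": [0.62, 0.31, 0.45, 0.28],
    "person_b": [0.10, 0.80, 0.20, 0.70],
}
print(identify(gallery, [0.60, 0.33, 0.44, 0.29]))  # close to person_a
```

The cost and the error rate both grow with the size of the gallery, which is one reason crowd identification is much harder than unlocking a phone.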
The problem with face masks
As facial signatures are based on mathematical models of the relative positions of facial features, anything that reduces the visibility of key characteristics (such as the nose, mouth and chin) interferes with facial recognition. There are already many ways to evade or interfere with facial recognition technologies, with some of these evolving from techniques designed to evade number plate recognition systems. Although the coronavirus pandemic has escalated concerns around the evasion of facial recognition systems, leaked U.S. documents show these discussions taking place back in 2018 and 2019, too.
And while the debate on the use and legality of facial recognition continues, the focus has recently shifted to the challenges presented by mask-wearing in public. On this front, the U.S. National Institute of Standards and Technology (NIST) coordinated a major research project to evaluate how masks impacted the performance of various facial recognition systems used across the globe. Its report, published in July, found some algorithms struggled to correctly identify mask-wearing individuals up to 50 percent of the time. This was a significant error rate compared to when the same algorithms analyzed unmasked faces. Some algorithms even struggled to locate a face when a mask was covering too much of it.
Finding ways around the problem
There are currently no usable photo data sets of mask-wearing people that can be used to train and evaluate facial recognition systems. The NIST study addressed this problem by superimposing synthetic masks (of various colors, sizes and positions) over images of faces. While these superimposed masks may not be realistic portrayals of a person wearing a mask, the approach was effective enough to study how mask-wearing affects facial recognition systems.
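The idea of superimposing a mask can be illustrated with a toy grayscale image: the function below simply blanks out the lower rows of the image, where the nose, mouth and chin would sit. This is a simplified sketch of the general technique, not NIST's actual method, whose mask shapes, colors and positions were far more varied.

```python
def superimpose_mask(face, coverage=0.5, mask_value=255):
    """Simulate a face mask by overwriting the lower rows of a face
    image (a toy sketch of synthetic-mask data augmentation).

    face: 2-D list of grayscale pixel values (rows of ints).
    coverage: fraction of the image height covered from the bottom.
    mask_value: pixel value used to represent the mask.
    """
    height = len(face)
    top = int(height * (1 - coverage))  # first row hidden by the mask
    return [
        row[:] if i < top else [mask_value] * len(row)
        for i, row in enumerate(face)
    ]

# A hypothetical 4x4 "face" image
face = [[10, 20, 30, 40],
        [11, 21, 31, 41],
        [12, 22, 32, 42],
        [13, 23, 33, 43]]
masked = superimpose_mask(face, coverage=0.5)  # bottom two rows hidden
```

Running a recognition algorithm over both `face` and `masked` versions of the same images is what lets researchers measure exactly how much accuracy the occlusion costs.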
It is possible images of real masked people would allow more details to be extracted to improve recognition systems – perhaps by estimating the nose’s position based on visible protrusions in the mask. Many facial recognition technology vendors are already preparing for a future where mask use will continue, or even increase. One U.S. company is offering masks with customers’ faces printed on them, so they can unlock their smartphones without having to remove their masks.
Growing incentives for wearing masks
Even before the coronavirus pandemic, masks were a common defense against air pollution and viral infection in countries including China and Japan. Political activists also wear masks to evade detection on the streets. Both the Hong Kong and Black Lives Matter protests have reinforced protesters’ desire to dodge facial recognition by authorities and government agencies.
As experts forecast a future with more pandemics, rising levels of air pollution, persisting authoritarian regimes and a projected increase in bushfires producing dangerous smoke – it is likely mask-wearing will become the norm for at least a proportion of us. With that in mind, facial recognition systems will need to adapt. Detection will be based on features that remain visible, such as the eyes, eyebrows, hairline and general shape of the face. Such technologies are already under development. Several suppliers are offering upgrades and solutions that claim to deliver reliable results with mask-wearing subjects.
For those who oppose the use of facial recognition and wish to go undetected, a plain mask may suffice for now. But in the future they might have to consider alternatives, such as a mask printed with a fake computer-generated face.
Paul Haskell-Dowland is the Associate Dean of Computing and Security at Edith Cowan University.