Vendors of face recognition are updating their tech as people don masks to protect against Covid-19. Police are bound to take notice.
What’s new: Companies that provide computer vision systems, including at least one that supplies law enforcement agencies, are training models to recognize obscured faces, according to USA Today. Worldwide protests in support of civil rights for Black people have energized police interest in the technology while reigniting concerns about potential violations of civil liberties.
What’s happening: With people’s noses, mouths, and chins obscured by masks, companies are retraining face recognition models to identify people from their upper faces alone (a rough sketch of the idea follows the list below). Some claim to have solved the problem.
- Rank One Computing, which provides face recognition systems to 25 U.S. police forces, recently upgraded its system to identify people by eyes and eyebrows.
- SAFR, which markets its technology to schools, claims its system recognizes masked faces with 93.5 percent accuracy, but only under perfect conditions.
- U.K.-based AI firm Facewatch, which targets retail companies, says its models recognize masked individuals.
- Several municipal and federal law enforcement agencies in the U.S. have collected face imagery from protests held in recent weeks.
- In March, researchers from Wuhan University released a trio of simulated and real masked-face datasets, including one with 5,000 real-world examples. The following month, U.S.-based startup Workaround published a dataset that contains 1,200 masked selfies scraped from Instagram.
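To make the upper-face approach concrete, here is a minimal, hypothetical sketch — not any vendor’s actual pipeline — of two preprocessing steps such systems typically rely on: overlaying a synthetic mask on the lower face to build training data (as in the simulated portions of the Wuhan datasets) and cropping the periocular band that a mask leaves visible. It assumes OpenCV and NumPy; the recognition model itself is deliberately left out.

```python
# Hedged sketch (not any vendor's actual pipeline): simulate a masked face for
# training data and crop the periocular (eyes/eyebrows) band for matching.
import cv2
import numpy as np

# Standard OpenCV Haar cascade for frontal faces, used only for localization.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def simulate_mask(face_bgr):
    """Cover the lower half of a face crop with a flat patch — a crude
    stand-in for synthetically generated masked-face training examples."""
    masked = face_bgr.copy()
    h = masked.shape[0]
    masked[h // 2 :, :] = (200, 200, 200)  # light-gray "surgical mask" block
    return masked

def periocular_crop(image_bgr):
    """Detect the largest face and return only its upper ~45 percent,
    roughly the eyes-and-eyebrows band a mask leaves visible."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    upper = image_bgr[y : y + int(0.45 * h), x : x + w]
    return cv2.resize(upper, (112, 48))  # fixed size for a downstream embedder

if __name__ == "__main__":
    # "example.jpg" is a placeholder path; in practice this would be a real photo.
    frame = cv2.imread("example.jpg")
    if frame is not None:
        crop = periocular_crop(frame)          # what a masked-face matcher would see
        augmented = simulate_mask(frame)       # extra training data for retraining
```

A real system would feed such crops or augmented images to an embedding network retrained on masked examples; the sketch stops at the preprocessing step.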
Behind the news: Many face recognition models have trouble identifying individuals even without masks, particularly members of minority groups, according to the U.S. National Institute of Standards and Technology. The agency announced plans to test the accuracy of face recognition on masked faces but suspended the effort amid the pandemic.
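For context on what a figure like SAFR’s 93.5 percent means, here is a hedged sketch — not NIST’s protocol — of how verification accuracy is commonly scored: compare pairs of face embeddings with cosine similarity and count correct decisions at a fixed threshold. The toy embeddings stand in for outputs of a masked-face recognition model.

```python
# Hedged sketch of a standard verification-accuracy measurement (not NIST's
# actual protocol): score same-person and different-person embedding pairs.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verification_accuracy(genuine_pairs, impostor_pairs, threshold=0.5):
    """genuine_pairs: (emb1, emb2) from the same person;
    impostor_pairs: (emb1, emb2) from different people."""
    true_accepts = sum(cosine_similarity(a, b) >= threshold for a, b in genuine_pairs)
    true_rejects = sum(cosine_similarity(a, b) < threshold for a, b in impostor_pairs)
    return (true_accepts + true_rejects) / (len(genuine_pairs) + len(impostor_pairs))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: same-person pairs differ by small noise; impostors are unrelated.
    genuine = [(v, v + rng.normal(0, 0.05, 128)) for v in rng.normal(size=(100, 128))]
    impostor = [(rng.normal(size=128), rng.normal(size=128)) for _ in range(100)]
    print(f"accuracy at threshold 0.5: {verification_accuracy(genuine, impostor):.3f}")
```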
Why it matters: Many U.S. law enforcement agencies are using face recognition to identify protesters. The questionable accuracy of these systems — particularly those aimed at masked individuals — could exacerbate the very injustices the current protests aim to highlight.
We’re thinking: Face recognition technology cannot achieve its potential for good until the public can trust that these systems are accurate and free of bias, both institutional and algorithmic.