Police use of facial recognition tech resumes with guardrails
Critics argue the law governing its use is still too broad.
Law enforcement across Virginia may, once again, use facial recognition technology as an investigative tool. The software uses facial features to identify a person. Police can use it to match a person’s face to a database of collected images.
This high-tech tool has drawn scrutiny from state legislators and privacy advocates over the past several years, but some lawmakers hope new legislation will provide more guidance and limitations on how it is used.
Virginia’s Legislature passed a ban on local law enforcement using the technology in 2021 after The Virginian–Pilot reported several officers were using software provided by Clearview AI to investigate crimes as part of a free trial of the technology. Clearview maintains a database of publicly available images swept from the internet.
According to The Washington Post, the company claims its database contains over 100 billion facial images. The world’s population is 7.8 billion.
Virginia State Sen. Scott Surovell (D–Fairfax) and Del. Jay Leftwich (R–Chesapeake) co-sponsored a bill that permits use of facial recognition technology — with some restrictions.
“In Virginia, we basically had zero standards on facial recognition technology,” said Surovell, who is also a civil and criminal attorney. “I thought it was important to get some sort of minimum baselines in place for law enforcement to follow if they’re going to use this technology to investigate crime.”
The law, which passed in 2022 with bipartisan support, limits law enforcement to 14 scenarios in which the technology can be used to help identify a person. Some are broader than others:
- a person an officer has "reasonable suspicion" to believe has committed a crime
- a victim of a crime, including those "of online sexual abuse material"
- possible missing persons or witnesses to criminal activity
- a victim of human trafficking or "involved in the trafficking of humans, weapons, drugs, or wildlife"
- an online recruiter of criminal activity
- a person with a mental or physical disability that impairs the ability to communicate and be understood
- an unidentified dead person
- someone who is incapacitated and unable to identify themselves
- a person who is "reasonably believed" to be a danger to themselves or others
- someone who is lawfully detained without identification
- mitigate "an imminent threat to public safety, a significant threat to life, or a threat to national security, including acts of terrorism"
- "ensure officer safety as part of the vetting of undercover law enforcement"
- possible perpetrators of identity theft
- "a person who an officer reasonably believes is concealing his true identity and about whom the officer has a reasonable suspicion has committed a crime other than concealing his identity"
The law prohibits real-time use of facial recognition technology, which means law enforcement is not allowed to use the technology on live video streams. The law also requires that any facial recognition algorithm must have earned a 98% accuracy rating from the National Institute of Standards and Technology, which vets the technology through its Face Recognition Vendor Test.
Police departments are also required to publicly disclose if they are using the technology and in which cases it's being applied.
Surovell sees the crime-solving potential of the tool. “Some of my constituents start(ed) complaining on NextDoor about porch pirates, people coming onto their porch and stealing their packages that were delivered. People would post these videos and say, ‘Does anybody know this guy?’ But, if you could put that picture on facial recognition technology, they could probably find them pretty quickly.”
On the other hand
While proponents of the law point to the technology’s benefits, critics argue Virginia’s law is too broad.
Alison Powers is the director of policy and education at the Virginia Indigent Defense Commission, a state agency that oversees public defenders.
“It will sweep up clients of color, clients who are indigent and people who typically are involved in property crimes and low-level offenses,” Powers said. “The things that are typically captured on video.”
Historically, facial recognition technology has disproportionately misidentified people with darker skin. A Harvard study found Black women between the ages of 18 and 30 were the most likely to be misidentified.
Virginia’s law is the first in the nation to require that facial recognition technology meets federal standards, but Powers argues the NIST accuracy testing doesn’t apply to real-world scenarios.
“Many algorithms get very high marks when you’re comparing a mug shot photo to another mug shot photo,” Powers said. “When you change one of those photos to a more blurry photo — like a kiosk photo, surveillance photo — the accuracy levels dropped significantly.”
Over the past several years, there have been four known instances across the U.S. in which police wrongly charged Black men with crimes after using the tool: two in Michigan, one in New Jersey and, most recently, one in Louisiana.
There are no known cases in Virginia of police misidentifying a suspect using facial recognition technology.
How it's being used in Virginia right now
For more than a decade, Virginia State Police has been using facial recognition software to compare images to a database compiled entirely of mug shots.
Lt. Col. Tricia Powers, who is not related to Alison Powers, oversees VSP’s use of the tool. She emphasizes the technology should only be used as part of a police investigation.
“The algorithms have come a long way through the years. They’re much more accurate than they used to be,” Tricia Powers said. “But they’re not there yet where they can provide a positive match.”
The Virginia State Police uses software from a vendor named DataWorks to scan its database for possible matches to a queried image. Powers explained that trained technicians run the searches, which produce candidate lists. Law enforcement can use those lists to narrow down potential suspects or identify victims.
Tricia Powers said misidentifications can occur when officers do not fully understand the technology.
“It was a lack of training on the part of law enforcement, when the software spit out a candidate instead of realizing that this is a tool. It’s not a match like a fingerprint,” she said about the misidentifications made outside of Virginia. “Sometimes enforcement action was taken when it shouldn’t have been.”
Attorney Alison Powers says a database like the VSP’s is the best option but worries about what may happen when the technology becomes more widespread.
“We will gather more data and stories of how the technology itself is flawed because it’s human made,” Alison Powers said. “So, if a human is biased that’s going to be built into the technology itself.”
Lawmakers hope the new restrictions on the technology will prevent misuse.
“There’s pluses and minuses to all these types of technologies,” Surovell said. “What’s important is that we have clear standards so that people can’t just use them whenever and however they feel like using them.”
The Virginia State Police released a model policy to guide local law enforcement and campus police in further use of facial recognition technology. Police departments can either choose to adopt this policy or develop their own with stricter guidelines.