Youngkin administration has missed deadline to set AI rules for state police

Virginia Attorney General Jason Miyares, right, speaks at a press conference on Thursday, Dec. 12, 2024, at the Patrick Henry Building in Richmond. Gov. Glenn Youngkin, center, and House Minority Leader Todd Gilbert, R–Shenandoah, left, listen. (Shaban Athuman / VPM News File)

The standards — ordered by the governor — are more than seven months overdue.

Gov. Glenn Youngkin’s administration and state Attorney General Jason Miyares are seven months behind a deadline set by the governor to outline standards for Virginia State Police’s use of artificial intelligence.

Youngkin signed an executive order in January 2024 implementing AI guidelines for state agencies, including standards “for the ethical use of AI” and a mandatory approval process to use the technology.

The order directed Miyares and Terry Cole, Virginia’s public safety and homeland security secretary, to develop rules for AI use “applicable to all executive branch law enforcement agencies and personnel” within nine months.

That deadline passed on Oct. 18, 2024, and specific rules for law enforcement are still not in place.

The delay comes as law enforcement's use of AI continues to expand and draw scrutiny, from facial recognition technology, license plate readers and other surveillance systems to officers using the technology to write crime reports.

And while some regulations have passed, many recent legislative efforts aimed at further regulating the technology in Virginia have failed.

“We just see this constant growth of surveillance among law enforcement and this really consistent collection of data among people,” Steven Keener, an assistant professor of criminology at Christopher Newport University, told VPM News in an interview.

A Virginia State Police spokesperson declined to share how the agency uses AI, but told VPM News that its “High Tech Crimes Division does not utilize AI in their investigations.”

VSP’s AI use is regulated by Youngkin’s January 2024 executive order and another order from February 2025 that bans the use of AI from the Chinese firm DeepSeek on state government devices and networks, VSP spokesperson Matthew Demlein wrote in an email.

In Virginia, police can use facial recognition technology that uses AI to capture and match facial features to identify people “in photos, videos, or real time.”

Virginia lawmakers banned the technology in 2021, then passed a bill two years later allowing law enforcement to use facial recognition with some guardrails. A 2019 federal study found racial bias in many facial recognition systems, though researchers say improvements have since been made.

Keener, who is also the director of CNU’s Center for Crime, Equity, and Justice Research and Policy, raised concerns over AI algorithms using biased data that can lead to discriminatory outcomes.

Supporters of AI-based tools for law enforcement argue they can help reduce bias that typically comes from the humans who program the systems, Keener told VPM News. But he said that view could minimize the scrutiny the systems themselves face.

“What I'm worried about,” Keener said, is that “we may lose sight on how these algorithms and how this technology can reproduce those historical biases and can potentially allow for racial biases in the system to continue to perpetuate without the same kind of interrogation we've had in the past.”

Keener said he’s also worried about the role played by private companies that profit off AI technology used by law enforcement.

“There's already the criminal justice concerns and concerns about mass surveillance, but also these private companies,” he told VPM News. “How long are they holding on to this data? What access do they have to it? What can they do with it?”

VSP has had multiple contracts with Dataminr, a social media surveillance company that partners with the social media platform X, including a recent $200,000 deal for licenses for its First Alert service.

Neither responded to questions about the contract. Josh Stanfield, who runs the Virginia Politics Revealed newsletter, said VSP estimated that it would cost over $1 million to respond to his public records request for communications between the agency and Dataminr.

Los Angeles police used Dataminr to track more than 50 Gaza-tied protests in the city between October 2023 and April 2024, according to reporting from the Intercept.

VSP’s 15-year contract with Tech5 — worth an estimated $54 million — aims to improve the agency’s fingerprint collection capabilities, according to a 2024 press release announcing the deal.

On Wednesday, a Tech5 representative told VPM News they would “check with the team and with VSP” about an interview on the contract. The company did not make anyone available before publication.

In 2019, Virginia Capitol Police reached a deal to beta test Liberty Defense Technologies' HEXWAVE system at the State Capitol to scan visitors for weapons. The system uses machine learning, a subset of AI that allows it to learn and improve from the data it receives, per StateScoop.

A Virginia Capitol Police spokesperson did not respond to an interview request.

When Youngkin announced his order establishing the state’s AI policies, his office touted the move: “As one of the first states in the country to issue AI standards, Virginia is leading the way on AI guidelines and pilots.”

“The Administration is working with its AI Task Force and stakeholders to finalize standards that address all key issues,” Youngkin spokesperson Peter Finocchio told VPM News.

Youngkin’s order also directed the public safety and homeland security secretary to provide AI standards to local law enforcement agencies that request them.

Finocchio did not respond when asked why the rules are delayed and if there is a timeline for them to be finished. Youngkin leaves office in January 2026.

A Miyares spokesperson said any advice from the AG’s office “would be protected under attorney/client privilege.”

Virginia has a patchwork of AI rules in place, but it still doesn’t have comprehensive legislation regulating the technology. During the 2025 General Assembly session, one high-profile AI bill was vetoed and multiple others failed.

Youngkin vetoed a bill that would have established rules for the use of “high-risk” AI systems, aimed at protecting people from discrimination. Had it passed, Virginia would have been the second state with sweeping AI consumer protection rules.

Among the legislation that didn’t advance out of committee was a bill to establish the Artificial Intelligence Transparency Act, which would have required developers to meet certain requirements for generative AI systems available in Virginia.

Corrected: June 5, 2025 at 10:37 AM EDT
A previous version of this story misspelled Steven Keener's name.
Dean Mirshahi is a general assignment reporter at VPM News.