Virginia legislation calls for human oversight of AI use in court decisions

Del. Cliff Hayes, D-Chesapeake, makes his way to the General Assembly Building in the rain following a General Assembly session on Wednesday, February 12, 2025, in Richmond, Virginia. Hayes sponsored a bill that would require human oversight when using artificial intelligence in court decisions. (Shaban Athuman / VPM News)

Researchers warn of potential biases in AI algorithms.

Virginia lawmakers want to regulate the use of artificial intelligence-based tools in the criminal justice system.

During this year’s General Assembly session, Del. Cliff Hayes Jr. (D–Chesapeake) introduced a bill that would reinforce human oversight in the criminal justice system while allowing AI to play a supporting role.

Hayes’ bill would prohibit AI-generated recommendations from being used as the sole basis for key decisions related to pre-trial detention or release, prosecution, adjudication, sentencing, probation, parole, correctional supervision, or rehabilitation. It would also make any use of AI in those decisions subject to legal challenge or objection.

Hayes, who has worked in technology management for three decades and witnessed AI's rapid growth, said dependence on the technology has accelerated recently.

“AI definitely offers great benefits,” Hayes said. “But there’s another side to that coin. In some cases, we know AI, when it’s not accurate, can be extremely damaging and harmful.”

Hayes questioned whether the government should be in this “sandbox,” experimenting with people's court cases, which could have a significant effect on their livelihood.

“I think we need to continue to have human oversight in those cases, qualified human oversight,” Hayes said. “The people who today are qualified to make those judgments, those decisions, should be the same individuals to make those determinations, and not rely 100% on AI.”

At least 26 states allow law enforcement to run facial recognition searches against driver's license and identification databases, according to data from the Center on Privacy & Technology at Georgetown Law. Virginia law currently allows the use of facial recognition technology.

Sixteen states allow the FBI to use the technology to find suspects in a “virtual lineup,” according to the Georgetown study. Over 117 million American adults are included in these facial recognition networks.

A separate study by the National Institute of Standards and Technology found that Black and Asian people are anywhere from 10 to 100 times more likely to be misidentified than white people, depending on the facial recognition algorithm.

“We’re a system that disproportionately incarcerates people of color, especially Black men,” said Steven Keener, assistant professor of criminology at Christopher Newport University, and director of the university’s Center for Crime, Equity, and Justice Research and Policy.

The goal of AI software is to reduce bias and racism in the system, according to Keener. But research has found many AI tools and algorithms are biased, he said. The data going into the software to create AI tools and algorithms could be biased, which impacts the data output used to make important decisions — such as who is eligible for bail.

“What data set are you using to build the algorithm that determines who is safe and who is unsafe?” Keener said.

AI systems are not yet capable enough to make such tough decisions by themselves, according to Sanmay Das, a computer science professor at Virginia Tech, and associate director of AI for social impact at the Sangani Center for Artificial Intelligence and Data Analytics.

“I think the key point over there is accountability, right?” Das said. “If you did not have human oversight, it’s really easy to blame the machine, or the algorithm.”

Humans cannot replace bureaucracies with AI, Das said, even though some tools can be helpful. Because of the speed and scale at which AI operates, flawed decisions affecting thousands of people could be catastrophic.

“I think AI tools can be enormously helpful in many of these kinds of domains,” Das said. “But, I think that we’re going to need to deal with this challenge that people may be tempted to use them and apply them at really grand scales in order to save human time, to save human effort.”

Gov. Glenn Youngkin has until March 24 to review, amend, sign or veto the legislation.

Capital News Service is a program of Virginia Commonwealth University’s Robertson School of Media and Culture. Students in the program provide state government coverage for a variety of media outlets in Virginia.
