General Assembly, Youngkin’s admin are already working on AI policy

Andrew Wheeler, head of the Office of Regulatory Management, was initially nominated by Gov. Glenn Youngkin in 2022 to be the Secretary of Natural and Historic Resources. (Photo: Steve Helber / AP)

The Office of Regulatory Management delivered a report to the governor on Thursday.

Both Gov. Glenn Youngkin’s administration and legislators are developing standards for artificial intelligence, policymakers tell VPM News.

In September, Youngkin issued an executive directive tasking the Office of Regulatory Management with writing recommendations for state policies on AI. The recommendations, which were due Friday, were delivered to Youngkin on Thursday, according to ORM head Andrew Wheeler.

“The governor had some questions. And we will be fine-tuning some of the documents before they'll be released publicly,” said Wheeler.

While AI has been around for years, “generative AI,” which creates new content such as images, text or sound based on data it accesses, has only recently become prevalent.

Youngkin’s executive directive comes as generative AI poses risks and opportunities for workers and companies, and the federal government reviews public comment on draft guidance.

“The technology is outpacing [regulations] and laws, and so sometimes it’s hard to have specificity in a fast-moving area of technology,” said Cayce Myers, a communications professor at Virginia Tech.

While the draft recommendations were not publicly available as of Friday, Youngkin’s original directive tasked ORM with looking mostly at how government uses AI in the delivery of services and in workforce development and education. Wheeler outlined the recommendations in an interview Thursday that was light on specifics.

The Virginia draft recommendations cover questions such as how data is protected, what new energy supply data centers will need, how to prepare the workforce, and how to disclose to Virginians when artificial intelligence has played a role in state services.

“The governor sees AI as an incredible tool, but he also sees some inherent risks,” Wheeler said. “He wants to make sure that we have those safeguards in place before … we do any kind of full utilization of AI.”

The recommendations also look to prepare K-12 students for a future with artificial intelligence, in which they use it “thoughtfully and responsibly,” said Wheeler. He added that the governor is interested in striking a balance between students using generative AI to complete assignments and preparing them to use it in the workforce.

Del. Michelle Maldonado of Manassas, one of the founders of the Virginia Technology and Innovation Caucus, didn’t express concern that the recommendations might not be made available to legislators.

“I would like to see some information, some sharing, some transparency, but I also want it to be responsible, and I want it to be accurate,” she said. “And I want it to be helpful as we [on] parallel tracks try to figure out how to create the frameworks that are appropriate and necessary for the commonwealth.”

The Democrat said that upcoming legislation on AI this session would deal with assessing privacy and cybersecurity concerns, but a significant part of the work on the issue for legislators begins with defining artificial intelligence.

“As legislators, it’s part of our job to help educate people. And that’s part of what we want to do — is provide policy and resources in place to help with digital competency, digital literacy,” she said.

Maldonado said upcoming legislation would also deal with deepfakes: audiovisual content that impersonates public figures, such as clips that circulate during an election season.

While Youngkin’s executive directive did not include specific instructions on law enforcement applications of AI, Maldonado said those uses also need to be considered.

Myers, the Virginia Tech professor, said developing guidelines for AI in policing will need extra care.

“What you're looking at is a very sensitive area, and one in which you do not want to get it wrong with AI, because you're dealing with people's life and liberty,” he said. “In that particular high-stakes system, there seems to be a thought that having a human being making a decision is much better than an algorithm. And there's a real concern around bias and discrimination in that area.”

The National Institute of Standards and Technology, a Commerce Department agency, has said that bias can arise not only from existing data sets but also from human factors.

Wheeler said that law enforcement’s use of AI will get its own, separate process, and that standards would address identifying bias in any AI products or data sets used in state-purchased software.

“We want to take a hard look at that,” he said. “And I believe we’ll probably end up developing some standards for the use of AI by law enforcement.”

Jahd Khalil covers Virginia state politics for VPM News.