Youngkin signs executive directive on AI
ORM, VITA to establish use standards, look for testing opportunities
Gov. Glenn Youngkin has signed an executive directive on artificial intelligence, instructing the state government to establish use standards and identify opportunities to deploy AI technologies.
Executive Directive No. 5, announced by the governor’s office late Wednesday, directs the Office of Regulatory Management to work with the state’s chief information officer — Bob Osmond, who leads the Virginia Information Technologies Agency (VITA) — and relevant secretariats to review AI standards and piloting opportunities across four areas. It gives the ORM and CIO a deadline of Dec. 15 to deliver recommendations.
“Virginia is a leader in technology and will stay a leader in technology. The increasing use of AI, especially generative AI, offers tremendous opportunities to transform the way we serve all Virginians, from launching innovative, personalized education tools to improving customer service and beyond,” Youngkin said in a statement. “At the same time, we must ensure that these AI products and technologies have appropriate standards and guardrails to protect individual privacy rights in a transparent manner.”
Youngkin created the ORM through an executive order on July 1, 2022. He appointed Andrew Wheeler, the U.S. Environmental Protection Agency administrator under President Donald Trump, to head it following the Virginia Senate’s rejection of Wheeler as Youngkin’s pick for state secretary of natural and historic resources in February 2022.
“The commonwealth is home to one of the most innovative workforces and some of the most critical national security institutions in our country,” Wheeler said in a statement. “Together with our academic research institutions, Virginia can lead the way in the transparent and innovative use of AI nationally.”
The first focus area is a legal and regulatory review. According to the directive, the ORM and CIO, working with the state attorney general’s office, will review existing laws and regulations that may apply to AI and determine if updates are necessary; ensure the state government’s use of AI has “sufficient safeguards in place to protect individual privacy rights”; and make recommendations for uniform standards of AI use across the state government.
The second area of focus is education and workforce development. The ORM and CIO will work with the Virginia Department of Education, the State Council of Higher Education for Virginia and other higher education institutions to promote guidelines for AI use in learning and prohibit cheating; examine AI tools for personalized tutoring, especially in K-12 education; and include AI-related topics in K-12 and higher education courses. They are also directed to examine efforts to include AI technologies in workforce development.
Third, the ORM and CIO should focus on the modernization of state government, including identifying opportunities for AI to improve state government operations.
The fourth area of focus is economic development and job creation. Working with the Virginia Economic Development Partnership, the ORM and CIO should identify potential industry clusters that could benefit from AI; explore ways to encourage AI innovation and entrepreneurship, such as through incubators and accelerators; assess the risks and opportunities of AI on the labor market, including which jobs might be displaced and which could be created; develop strategies to support impacted workers; and coordinate with schools and workforce programs on the steps to develop an AI-ready next generation.
In that same focus area, the ORM and CIO will work with the Virginia Department of Energy to examine the expected growth in energy demand resulting from the greater computing capacity that broader AI adoption will require.
Virginia is not alone in examining AI usage regulations. In the 2023 legislative session, 14 states and Puerto Rico adopted resolutions or enacted legislation focused on AI, according to a July publication from the National Conference of State Legislatures.
Recent legislation in Connecticut is similar to Youngkin’s executive directive. It required the Connecticut Department of Administrative Services to conduct an inventory of AI systems in use by state agencies and, beginning Feb. 1, 2024, to perform ongoing assessments of those systems. Connecticut also required its Office of Policy and Management to establish AI system usage policies for state agencies.
Texas, North Dakota, Puerto Rico and West Virginia have created advisory councils to study AI systems used by state agencies.