
Wyoming lawmakers are grappling with the debate over artificial intelligence regulation

(TNS) — State lawmakers across the country are tackling the issue of regulating deepfakes, a monumental feat to undertake as the law rushes to catch up with the development of advanced artificial intelligence technology.

Deepfakes, or digitally altered content generated using artificial intelligence, worry national security officials. The U.S. Department of Homeland Security (DHS), the National Security Agency and the FBI recently issued warnings about the rapidly evolving technology.

“Deepfakes and the misuse of synthetic content pose a clear, present, and evolving threat to society in the areas of national security, law enforcement, finance, and society,” the DHS report said.

An NPR story published in February reported that New Hampshire voters received an automated phone call impersonating President Joe Biden, intended to discourage them from voting in the state’s upcoming Democratic primary.

Wyoming lawmakers on Monday discussed a bill the Legislature’s Select Committee on Blockchain, Financial Technology and Digital Innovation Technology sponsored during the recent 2024 budget session.

Senate File 51, “Unlawful Distribution of Deceptive Synthetic Media,” would have prohibited the distribution of synthetic media, or deepfakes, with the intent to mislead people and spread disinformation. The bill also would have required a disclaimer on any digital content altered using AI technology.

SF 51 made it halfway through the session before dying in a House committee, where some lawmakers argued the bill hindered individual freedom of expression.

“It’s real tension. It’s always a challenge when you put speech restrictions in place,” Sen. Chris Rothfuss of Laramie County told the Wyoming Tribune Eagle on Thursday. “How do you draw these lines? Where do you draw them? And how do you make sure you have a compelling state interest?”

Narrowing it down

Sen. Affie Ellis, R-Cheyenne, said Monday that her greatest concern was the bill’s broad language, which could lead to unintended consequences. She said she would feel more comfortable with the bill if it were narrowly tailored to a specific area, such as election campaigns.

“If it was just tailored to the campaign… then we could be discussing defamation and slander,” Ellis said. “They fit better on their own than as one big bill, because that’s where I start to get lost and see the unintended consequences.”

Matthew Kaufman, a practicing attorney who worked with the House Blockchain Task Force in 2019, agreed with Ellis, adding that the broad language would render the bill useless. In his opinion, the kind of tort the bill creates is an “exceptional circumstance.”

“I am concerned that we will create a standard that will be very difficult to meet,” Kaufman said.

Lawmakers at both the state and federal levels also recognize that the development of artificial intelligence, and the regulations governing it, “is playing out in real time,” Kaufman said. Many of these issues have not yet been settled at the federal level.

“I don’t want us to waste too much time on things that we simply can’t control,” Kaufman said.

Rothfuss previously told the WTE he was confident in the bill as written. Current state and federal laws provide no remedy for people who have been misled, he said at the meeting, and providing one is exactly what the bill was intended to do.

“Do we, as individuals, have the right to know whether something is true? You don’t have that right right now,” Rothfuss said.

Rothfuss questioned whether anything in current state law would protect a Wyoming voter misled by an out-of-state political candidate, such as a presidential candidate.

“In my opinion, it’s worth knowing whether the truth has been learned or not,” Rothfuss said. “But there is no mechanism in any law to provide a surety.”

Unintended consequences

Committee members weighed various uses of artificial intelligence technology that, while potentially misleading, are considered entirely legitimate. Rep. Daniel Singh, R-Cheyenne, noted that using AI-generated content is common practice in advertising.

Legislative Service Office attorney David Hopkinson said the bill specifies that information is considered misleading if a “reasonable person” cannot distinguish fact from fiction. Under that standard, AI-generated content in an advertisement, television program or film would be understood by a reasonable person as fiction.

Kaufman said part of the problem is that the bill could be applied broadly to almost any circumstance, making the law difficult to enforce and damages difficult to prove.

“I think about the unintended consequences,” Kaufman said. “They must ‘knowingly and intentionally’ disseminate information with the intent to mislead. There’s a lot to (prove).”

This broad terminology could prove “cumbersome and useless” in court unless it is narrowly tailored, as Ellis suggested, to election campaign laws, he said. Certified public accountant David Pope said he sees “a lot of value in taking smaller bites.”

“If we have to limit ourselves to political activities in the near future, it will allow us to work through the language, work through the concepts and build a framework that I would like to see expanded in the future,” Pope said.

Committee members decided to return to the discussion of SF 51 at their next meeting, which will be held July 1-2 in Sheridan.

© 2024 Wyoming Tribune-Eagle (Cheyenne, Wyo.). Distributed by Tribune Content Agency, LLC.