Michigan bills to regulate the use of artificial intelligence in campaign ads advanced out of the House Elections Committee Tuesday.
The bipartisan package would generally require disclaimers on AI-generated political content, whether audio or visual.
It would also ban using deepfake technology to falsify someone’s speech or behavior under certain conditions, such as within 90 days of an election.
Representative Penelope Tsernoglou (D-East Lansing) said unchecked usage of AI can be dangerous for democracy.
“No matter who we think is the best candidate, or whose views we agree with, those are their views and voters should know the truth of what those candidates want to get out there and shouldn’t be influenced by something that’s completely false,” Tsernoglou told reporters after Tuesday’s committee hearing.
During the meeting, Tsernoglou played several examples of deep-faked political videos for the House Elections Committee, which she chairs. To drive home the point that it's easy to fake someone’s voice, she played a fake audio message from “President Joe Biden” telling her to greet her kids for him.
The issue of AI and deepfakes has already come up in some out-of-state campaign materials. Notably, the presidential campaign of Florida Governor Ron DeSantis shared deepfake images of former President Donald Trump hugging former National Institute of Allergy and Infectious Diseases Director Anthony Fauci this past summer.
Tsernoglou told reporters that type of behavior could potentially fall under her package, which sponsors say aims to punish those who create and distribute falsified content, not people who unknowingly share it on social media.
“If it’s within 90 days of an election, and it’s available to Michigan viewers on [social media], then that (legislation) would kick in,” Tsernoglou said.
Under the package, offenses could range from a civil infraction to a five-year felony, depending on the nature and frequency of the violation.
Despite the bills’ bipartisan co-sponsorship, both Republican members of the House Elections Committee abstained when it came time to vote the package out of committee Tuesday.
Representative Jay DeBoyer (R-Clay) said he’s not against the legislation and sees its value. But he said he has some logistical concerns he’d like to see addressed, such as a provision defining how someone would prove content was fake or allowing the state or a candidate to seek a stay against potentially deceptive materials.
“If my opponent puts something up that’s actually true and I go to court to get an order from a judge for injunctive relief and they take it down and then the election occurs, and then by the time we go to court and we get it all sorted out, the election’s over,” DeBoyer said.
The legislation provides for sanctions, but DeBoyer said those often aren’t imposed at the circuit court level.
Tsernoglou stressed that the attorney general or the candidate in question would still have to prove the material was faked to get a judgment in their favor.
“You can have witnesses, you can have evidence in court, just like with anything else, if one person says someone did something and they say they didn’t, the court determines that,” Tsernoglou said.
Though the bills advanced to the full House floor Tuesday, it’s unclear how soon they could pass the chamber. During the committee meeting, the sponsors said there’s still work to be done and that they’re open to potential changes.
Tsernoglou explained to reporters the rush to get the package out of committee.
“We think it’s important that they’re in place before any elections next year. So we need to keep moving them forward if we want that to happen. Because AI is here, deepfakes are here, they’re happening now. We have to address them,” she said.