The political consultant who used deepfake generative artificial intelligence to mimic President Joe Biden's voice and send robocall messages two days before the New Hampshire presidential primary now faces more than two dozen criminal charges and a $6 million fine.
Steven Kramer, a longtime Democratic political operative who was working with Biden's primary challenger, Rep. Dean Phillips, has admitted to being behind a robocall message sent to voters in New Hampshire. The robocall, an AI-generated message that sounded like Biden's voice, falsely insinuated that voting in the New Hampshire presidential primary would mean voters could not vote in November.
Kramer is now charged with 13 felony counts of voter suppression and 13 misdemeanor counts of impersonation of a candidate, across four counties, according to the New Hampshire Attorney General. The criminal charges against Kramer (filed in Belknap, Grafton, Merrimack and Rockingham counties) are each tied to a specific voter and allege that he "knowingly attempted to prevent or deter" each voter from voting "based on fraudulent, deceptive, misleading or spurious grounds or information."
"Two days before the New Hampshire 2024 presidential primary election, illegally spoofed and malicious robocalls carried a deepfake audio recording of President Biden's cloned voice telling prospective voters not to vote in the upcoming primary," the Federal Communications Commission wrote in a statement released Wednesday.
The FCC said the fine it proposed for Kramer is its first involving generative AI technology. The company accused of transmitting the calls, Lingo Telecom, also faces a $2 million fine, although in both cases the parties could settle or further negotiate.
"We will act swiftly and decisively to ensure that bad actors cannot use U.S. telecommunications networks to facilitate the misuse of generative AI technology to interfere with elections, defraud consumers, or compromise sensitive data," said Loyaan A. Egal, chief of the Enforcement Bureau and chair of the Privacy and Data Protection Task Force. "We thank our partners at the New Hampshire Attorney General's Office for their help with this investigation."
FCC Chairwoman Jessica Rosenworcel said regulators are committed to helping states go after perpetrators. In a statement, she called the robocalls "unnerving."
"Because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology," she said. "It is exactly how the bad actors behind these junk calls with manipulated voices want you to react."
She also said that the FCC actions were "only a start," because "AI technologies that make it cheap and easy to flood our networks with fake stuff are being used in so many ways here and abroad."
Shortly after New Hampshire's primary, the agency outlawed robocalls that contain voices generated by artificial intelligence. Separately, a bill requiring disclosure of the use of AI in audio or visual political ads was passed in Utah's 2024 legislative session and took effect May 1. The bill provides for a private right of action and a penalty of $1,000 per violation.