Last October, 14-year-old Elliston Berry woke up to a nightmare.
The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.
“I was told it went around the whole school,” Berry, from Texas, told Fox News. “And it was just so scary going through classes and attending school, because just the fear of everyone seeing these images, it created so much anxiety.”
The photos were AI-generated – what’s known as deepfakes. These generated images and videos have become frighteningly prevalent in recent years. Deepfakes are made to look hyper-realistic and are often used to impersonate major public figures or create fake pornography. But they can also cause significant harm to regular people.
Berry, now 15, is calling on lawmakers to write criminal penalties into law for perpetrators to protect future victims of deepfake images.
The teen told the outlet that after discovering what had happened, she immediately went to her parents. Her mother, Anna McAdams, told Fox News she knew the images were fake. McAdams then reached out to Snapchat several times over an eight-month period to have the photos removed.
Elliston Berry (left) and her mother Anna McAdams (right) say federal protections are needed for victims of deepfake porn. Berry was just 14 years old when a classmate distributed AI-generated nudes of her last year (CNN)
While the deepfakes of Berry were eventually taken down, McAdams told CNN, the classmate who distributed them is facing few repercussions.
“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.
This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deepfake pornography within two days of receiving a report.
The Take It Down Act would also make it a felony to distribute these images, Cruz told Fox News. Perpetrators who target adults could face up to two years in prison, while those who target children could face three years.
Cruz said that what happened to Berry “is a sick and twisted pattern that is getting more and more common.”
“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim’s family asks for it,” Cruz said. “Elliston’s mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”
The Independent has contacted Snapchat’s parent company, Snap Inc, for comment.
The mother and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deepfakes.
“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.
While the photos are now off Snapchat, Berry says she still fears they will resurface, which is why she is urging lawmakers to pass the Take It Down Act.
“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”