Omaha victim of deepfake images reacts to ‘Take It Down’ Act

An Omaha woman says a new federal law will give her and so many others more protection.

President Donald Trump signed the ‘Take It Down’ Act Monday. The law makes it illegal to publish explicit images of a person created with artificial intelligence.

Kayla Medeiros doesn’t know if deepfake images that were created of her have been published. But if she does find them on the internet, she says police can now actually do something about it.

“I’m constantly thinking if it’s out there,” she told KETV Newswatch 7 this week.

Medeiros is very aware that the internet can be a dangerous place.

“I often Google reverse image search some of them because I have anxiety that it’s out there and I won’t know,” she said.

KETV Investigates talked with Medeiros in February after she said she found photos manipulated to make her look undressed.

“I am very glad that I shared my story. I think part of that helped me heal,” she said.

Medeiros said she found the photos on an old iPad she shared with her ex-boyfriend. She told police about the images, but they told her no laws were in place for adult victims of explicit images made with artificial intelligence, also known as deepfakes.

“Basically, they said unless it’s posted somewhere, and I see that it’s posted somewhere, there’s nothing really to be done about it,” she said.

But that changed Monday when President Trump signed the ‘Take It Down’ Act. The bipartisan bill creates stricter penalties for the spread of explicit deepfake images.

“If it does get posted somewhere now, it’s illegal. So, I mean, I hope it doesn’t, but if it does, now there’s something I can do about it,” she said.

Medeiros still wants to see something done about the apps that allow people to create such content.

“People are using it for the wrong reasons,” she told KETV.

But she said the law creates meaningful change, and she hopes her story can empower others in similar situations.

“Don’t feel like you’re helpless, don’t just throw in the towel, because for me, I knew there weren’t laws, but I still made the report, made a paper trail, because you never know what will come up,” Medeiros said.

The penalty for spreading explicit deepfake images can include a fine, prison time, or both. Under the new law, websites and companies will have 48 hours to remove deepfakes after a victim requests their removal.
