FBI Special Agent discusses deepfake threats to children

ALBANY, N.Y. (NEWS10) — Deepfakes are synthetic media that use artificial intelligence to create realistic videos, pictures, audio, and text of events that never happened. Supervisory Special Agent Samantha Baltzersen at the FBI’s Albany office explained the potential threats this technology poses to children.

Baltzersen, who leads the Cyber Squad and Task Force at the Albany FBI, advised parents to maintain an open dialogue with their families about the dangers of AI. “Just having the conversation constantly about what you put out there about yourself…you don’t control it once it’s posted and what can possibly be done with it. With images, someone can now take your image and do just about anything with it.”

In late March, the FBI issued a public service announcement reminding would-be perpetrators that explicit images of children, even those created with generative AI, are illegal and will be prosecuted. In September, New York State Governor Kathy Hochul signed a bill into law making it illegal to disseminate AI-generated explicit images of a person without their consent. The law also gives victims the right to pursue legal action against perpetrators.

Baltzersen says current laws are not enough to protect internet users: “The laws are going to punish people after the fact…the image is still out there.”

Finding victims willing to come forward remains challenging. “People are hiding this for the most part right now. It’s already happening, but it’s just not being talked about. Most people aren’t going to come out and say, hey, this video was made of me,” Baltzersen explained.

She stressed the importance of children feeling safe coming to adults for help if explicit photos of them do surface online. “Let them know: we’re not here to punish you. We’re here to help you. Come talk to us and let us know when something like this is happening so we can help you deal with it. The last thing you want is your children being extorted or feeling like they have nowhere to turn.”

Victims can get help through organizations like the National Center for Missing and Exploited Children (NCMEC). “NCMEC has a program where they’ll help have that information pulled down. Most of the time they will. But there’s no guarantee that it’s off of every server all over the internet. And if somebody saves it, they can always re-upload it later,” Baltzersen noted.

She concluded by advising vigilance regarding online activity. “Watch what you post, who you share it with, how public it is, you know, consider making things really, you know, limited to your friends and family as opposed to more public, and then be aware of what other people are posting about you and your family.”

A study by Sensity AI found that approximately 90 to 95 percent of deepfake videos circulating online since 2018 were non-consensual pornography.