A mom recently went viral after sharing a disturbing experience involving artificial intelligence deepfake technology.
In a video that has since garnered over 1.8 million views, TikTok user Sam (@sammci9) discusses how a brand allegedly used AI to create a deepfake of her face for promotional purposes.
“That’s me,” she begins, showing a screenshot of the deepfake video in question. “That’s not my face because they used AI to make a deepfake of me.”
A brand created an AI deepfake using her likeness
Sam explains that the video was originally from her TikTok account, but the brand manipulated it.
“That is my video that is on this account that you are watching right now,” she explains. “A brand took [it], made a deepfake of me using AI, and edited my face, and re-uploaded it to further promote their product.”
The experience led Sam to reflect on a growing concern for online safety.
“The internet is a scary place, and people ask ‘Why don’t you post your kids anymore?’” she adds.
Other potential misuses of AI ahead?
Sam explains that if a brand can do this to her, the potential misuse of videos featuring children is even more alarming.
“What would people do to videos of children? It’s very weird. It’s very scary. This is not normal.”
Although Sam was able to contact the brand and get the video taken down swiftly, she remains disturbed.
“Once I brought this to their attention, it was taken down within 30 minutes,” she notes. “This is more so to promote internet safety because if this is happening to me, think about what is happening to other videos, especially videos of children.”
As Sam puts it, the experience was “very weird” and a reminder to be cautious about what we post online.
How widespread is deepfake video technology?
Deepfake videos are fabricated videos produced using artificial intelligence. Typically, all it takes to generate one is a clear image, a voice sample, and AI software.
According to statistics from Security Hero, the number of deepfake videos online exploded to 95,820 in 2023, a 550% increase since 2019.
While the site reports that most deepfake videos are used for adult content, instances like the one involving the TikToker show that scams and other misuse are on the rise.
For instance, another woman went viral earlier this year when she shared that a brand used her likeness to create a deepfake video to promote erectile dysfunction pills.
While there are currently no federal laws specifically regulating deepfakes, various bills have been introduced to address the issue, including the DEFIANCE Act of 2024.
Finally, Stanford University suggests that if you encounter a deepfake video, you should report it using the resources listed in its blog post.
Viewers are outraged
In the comments, users expressed anger and frustration over Sam’s experience, with many speculating about which brand might have been responsible.
“Please speak to an attorney,” advised one user.
“Is this why random accounts are saving my videos every few days and it’s the same three videos?” wondered another. “I don’t even have any traction on this app so I don’t know why they keep doing it lol.”
“My mom cozy portable pumps broke so needless to say between that experience and this video I will NOT be purchasing anything from them again,” said another, speculating that the brand in question might be breast pump company MomCozy.
In her video’s caption, Sam clarified that she wasn’t shading anyone who posts their children online; she simply wants people to be aware that this can happen and to be mindful of what they share.
We’ve reached out to Sam via email and TikTok messaging for more information. We’ve also contacted MomCozy via email.