Scammers are using artificial intelligence to create deepfake videos that simulate dangerous situations involving family members, posing a significant threat to unsuspecting individuals.
Consumer investigator Brian Roche, who has been exploring the use of AI in scams, demonstrated how easily these videos can be made.
AI-generated videos can take just a few minutes to create
Dr. Peilong Li, a computer engineer and professor at Elizabethtown College, created an AI-generated video of himself in just a few minutes. “To be honest, hearing and seeing is no longer the best way to believe it and how it is,” Li said. Li also made a deepfake video of Roche, showing how scammers can manipulate video content to deceive viewers.
Using a 30-second video provided by Roche, Li created a fake “boss scam” video of the kind used to target businesses. The video featured a message urging immediate action: “We’re in a high stakes meeting and we need to finalize this acquisition immediately! Wire the requested funds to the offshore account now or the entire deal is off.” Although the voice was slightly off, the recreated video was remarkably convincing.
Li noted, “AI has helped the scammers to a point that they need zero entry knowledge to do things,” adding, “but now you just need a laptop and a motive, and that’s it.”
Prepare in advance for the possibility of receiving deepfakes
The information needed to create such deepfake videos can be easily obtained from social media pages if privacy settings are not properly managed. To combat this threat, Li recommends that families prepare in advance for the possibility of receiving a deepfake video.
“And my recommendation is every family should have this family code. As I mentioned, it’s very important. It is a zero-dollar solution to a multimillion problem,” he said.
A family code could be a number or a phrase known only to family members that can be used to verify whether a situation is real. Alternatively, families can simply call the person who has allegedly been kidnapped to confirm their safety.
Question the authenticity of what we see and hear
The ease with which deepfake videos can be created underscores the importance of questioning the authenticity of what we see and hear, and of remaining vigilant and skeptical of digital content.