MANCHESTER, N.H. —
Scammers are using AI voice-cloning apps to create convincing deepfake scams, leaving families vulnerable to fraudulent calls from voices that sound like their loved ones.
With deepfake technology becoming increasingly convincing, Consumer Reports said people should take steps to protect themselves.
“Over the last few years, there’s been an explosion of calls claiming that, ‘We have your daughter. She’s in trouble. Send money or else,'” said Ben Colman, cofounder and CEO of Reality Defender. “Well, what’s happened recently is the call comes in and says, we are your daughter. ‘Hi, I’m your daughter. I’m in trouble, send money right now.'”
Colman said scammers do some work to build a convincing fake persona to use in the scam.
“A deepfake is taking anyone’s likeness, whether it’s their face, a single image from LinkedIn or online or a few seconds of audio, and using a pre-trained model, replicating their likeness to make them say or do anything you want,” he said.
Deepfakes have become so advanced that even experts find it difficult to tell real audio from a clone. Alarmingly, there are no federal laws preventing someone from cloning your voice without permission.
Consumer Reports reviewed six popular voice-cloning apps and uncovered a troubling trend.
“So, four of the six apps had no meaningful way to ensure that the user had the original speaker’s consent to voice clone them,” said Derek Kravitz from Consumer Reports. “And the two other apps were better, had more safeguards, but we found ways around them.”
While erasing your digital footprint is practically impossible, Consumer Reports suggests several steps to protect yourself.
“The first thing is just knowing that deepfake scams like this exist,” Kravitz said. “The second thing is using two-factor authentication on all of your financial accounts. That means having an extra security feature on your smartphone device that requires you to input a security code or respond to an email when trying to gain access to your bank accounts. And then the third thing is just being wary of calls, any type of texts or any type of emails that are asking you for your personal financial information or just personal data.”
Finally, Colman advised doing a gut check to assess whether what you’re hearing or seeing makes sense.
“By default, you should not believe anything you see online,” he said. “You should always follow just standard common sense.”