AI ‘deepfakes’ of Hurricane Helene victims circulate on social media

In the wake of Hurricane Helene, misinformation has flooded the internet, including two AI-generated images of a desperate, sobbing child aboard a boat in supposed floodwaters.

At first glance, the photos floating around online simply show a child in a lifejacket holding a dog as rain from the storm — the worst to hit the US since Hurricane Katrina in 2005 — continues to drench them.

A closer look, however, reveals several discrepancies between the two nearly identical photos, as reported by Forbes.

Two similar photos of a child holding a puppy in apparent floodwaters were generated by AI in the aftermath of Hurricane Helene, contributing to a flood of misinformation that has followed the storm. Larry Avis West/Facebook

The “deepfake” images circulated online of a small child with a puppy, seemingly floating through floodwaters from Hurricane Helene. Larry Avis West/Facebook

In one photo, the child even has an extra, misplaced finger.

She is also wearing two different shirts and sits in a different type of boat in each photo. The puppy’s coat is slightly darker in one shot, which is also more blurred and pixelated.

Sen. Mike Lee of Utah was among those who fell for the photo, sharing it on X Thursday and writing, “Caption this photo.” He later deleted it after users pointed out that the image was fake.

One Facebook user also fell for the “deepfake” image, sharing it with the caption, “Dear God, help our babies and their families!”

Some commenters called out the obvious signs that it had been tampered with.

Manipulated images portraying disasters can have long-term consequences, complicate relief efforts, create false narratives, erode public trust in times of crisis and hurt real people, Forbes reported. They can also be used to scam people into donating to fake fundraisers, though it is not clear if the image of the child has been used for that purpose.

The AI-generated images take attention away from the real people affected by tragedies, experts say. Ben Hendren

An AI-generated image shared widely online in May depicted rows of neatly organized tents in Gaza with several tents in the center spelling out “All Eyes on Rafah.”

The fake photo was shared by tens of millions of people on social media, including Nobel Peace Prize winner Malala Yousafzai and model Gigi Hadid, but critics say it failed to capture the reality of the war-torn region.

“People have been posting really graphic and disturbing content to raise awareness and that gets censored while a piece of synthetic media goes viral,” Deborah Brown, a senior researcher and advocate on digital rights at Human Rights Watch, told the Los Angeles Times.

Doctored images can complicate disaster response efforts, create false narratives and erode public trust during times of crisis. Nathan Fish / USA TODAY NETWORK via Imagn Images

Other misinformation regarding Hurricane Helene has spiraled online, prompting FEMA to launch a “Rumor Response” page on its website, tackling falsehoods claiming the agency is confiscating survivors’ properties, distributing aid based on demographic characteristics, and seizing donations and supplies.

One conspiracy theory claimed the government used weather control technology to aim the hurricane at Republican voters, according to reports.

“Help keep yourself, your family and your community safe after Hurricane Helene by being aware of rumors and scams and sharing official information from trusted sources,” FEMA advised.
