How to spot AI generated images
Images generated by artificial intelligence can be convincing at first. Here are some things to look for if you’re unsure if the image is real or not.
Students at a high school in the Western Dubuque Community School District went in front of school officials earlier this week to talk about other students creating fake nude images of them using artificial intelligence.
“I’m worried that every time they see me, they see those photos,” said Harper, one of four students who spoke out on April 28, KCRG reported. “I didn’t speak up before because I was afraid of being judged by others. I am speaking now because I need to fight for what is right.”
The Dubuque County Sheriff’s Office confirmed students had used apps to create fake nude photos, also known as “deepfakes,” of several female students at Cascade High School.
Deepfakes are photos, videos, or audio altered or created by AI to appear real, often without the subject’s consent. Many of the images are manipulated to place people in compromising situations, depicting them inappropriately or in settings that could spark controversy or embarrassment. The images have become a major cause for concern with the explosion of AI technology.
The district hosted a special session to discuss what changes should be implemented to school policies, KCRG reported.
“Throughout all of my years of school, I could have never imagined I would be involved in a situation so disgusting and violating as this,” Maggie, a student who spoke at the meeting, said to district officials.
Parents and students alike said they believed the school district was not doing enough to support students who have been targeted.
“No one at school has talked to me or offered any counseling,” another student named Emma said at the meeting. “I feel like we have been silenced at school, as teachers have said we aren’t allowed to talk about it at school. I put a lot of time and effort into my education and extra activities at Cascade High School. I wish I could say Cascade High School does the same for me.”
District Superintendent Dan Butler expressed his concern with the situation at the meeting, KCRG reported.
“Our hearts are bleeding with yours,” Butler said. “This is an awful situation. We’re here to work with you to collaborate and move forward in the best way that we can, but we want to do that together.”
As local school districts try to figure out how to stop AI-generated explicit photos of students, a bill to criminalize deepfakes is headed to President Donald Trump’s desk after sailing through both chambers of Congress with near-unanimous approval.
“The Take It Down Act” has enjoyed uncommon bipartisan support, along with a key endorsement from the first lady.
“It’s heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes,” Melania Trump said during a rare public appearance on Capitol Hill on March 3 to lobby for the legislation.
The newly passed bill will require technology platforms to remove reported “nonconsensual, sexually exploitative images” within 48 hours of receiving a valid request. Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minnesota, introduced the legislation in August.
The Senate passed the Take It Down Act in February with unanimous consent. The House followed suit on April 28, approving it 409-2.
The president is expected to sign the bill into law.
A law that Gov. Kim Reynolds signed in April 2024 made creating media that depicts minors in a sexual act a felony. The bill also requires anyone over the age of 18 who is convicted of the crime to register as a sex offender.
USA TODAY contributed to this report.
José Mendiola is a breaking news reporter for the Register. Reach him at jmendiola@dmreg.com or follow him on X @mendiola_news.