The cruel cost for women of the AI boom? Deepfake porn images all over the internet (and there could even be some out there of you)

It takes less than ten minutes for me to produce the fake nude picture of myself. I feed a photo of me arriving at an awards ceremony, wearing red lipstick and a mid-length black dress with beaded sleeves, into an app.

A short while later, without spending a penny, the dress is gone and I am staring at a highly realistic image of myself apparently completely naked on the red carpet.

If I didn’t know that it wasn’t my body – not my bikini tan line or my belly button – I would be completely fooled.

This is disturbing enough. But I take my experiment further. On another website I upload a photograph of myself and, for just £14, it is integrated into one of dozens of porn videos. It looks frighteningly real.

I know this sounds stupid; I know I should have been prepared. But even though they are not real, there is something horrifying about seeing those videos of yourself that you cannot imagine until it happens to you.

I start to sweat and my heart rate rises. I feel like I need a shower. I delete the video and close the page. But still my muscles tense, my throat tightens and I am catapulted back to September 2020, when my book Men Who Hate Women – about how violence and hatred towards women is stoked on internet forums – had just been published.

It was the height of the pandemic and I did a lot of talks and interviews online. Not long after, the abuse started. A lot was the sort of thing I was used to. Pictures of men holding machetes saying they were coming for me. Casual discussions about the best way to hang me.

Then something different. A picture of myself with a man carrying out a sex act on me. Even now it makes me shiver. Even now it feels like a violation. There’s still shock, disgust, fear and shame every time I see it.

Laura Bates's latest book, The New Age of Sexism, explores how technological advances are affecting women

Deepfake images and videos are rife online, taking just minutes to create in some instances

And the worst thought of all: these images might outlive me, might be my legacy – and there is nothing I can do about it.

I wanted to investigate deepfakes – digitally manipulated images and videos giving the false appearance of a person doing or saying something they didn’t do – for my latest book, The New Age Of Sexism.

This explores how technological advances – in particular AI – are impacting women. While many believe we are making progress towards gender equality, in reality a new, powerful and readily accessible tool to enable abuse and oppression of women and girls is exploding under our noses.

A recent study found 143,000 deepfake videos on 40 popular websites were viewed 4.2 billion times. These videos aren’t just of celebrities; one poll found 63 per cent of men who would consider making deepfake porn would use images of women they knew. All the perpetrator needs is a photo.

It could happen – indeed, is happening – to anyone. Women like you, your daughter, or your granddaughter. If we do nothing about it, I fear the violation of women will be coded into the fabric of the future.

Of all the abuse I receive, deepfake images and videos stay with me most. I think a lot of people who dismiss deepfake pornography as harmless cannot truly imagine how it would feel if it happened to them.

First is the total shock. The panic and desperation. Then fear sets in. This is ‘out there’. How many people have seen it? Oh God, what if your parents see it? You feel you’re going to be sick. You should report it. You should delete it. Where do you start?

Do you contact the website? But it could be circulating on other platforms. Even if you could force some websites to take it down – and that’s a big if – anybody could have downloaded it, shared it. It could be on tens of thousands of men’s computers. You feel dizzy.

You could call the police. Is it even a crime? Can you imagine showing these images to male officers you don’t know? You feel furious and then terrified and then furious again. The perpetrator could be anyone. What if it’s your ex? You think about colleagues and friends in a paranoid frenzy. How can you trust anyone? You think about the future and start to feel hopeless. This will always be out there.

No wonder the word used most by women I speak to who have been affected is ‘powerless’. While deepfake pornography is a new form of abuse, its underlying power dynamics are old. It’s not just about sexualising women; it’s about subjugating them. When deepfakes emerged a few years ago there was an obsessive focus on videos of famous women, but now the technology has evolved so anyone can produce them.

As such, it has already happened to far more women than you might realise, including teenagers. In June last year, one of the first major cases of mass deepfake pornography allegedly perpetrated by schoolboys emerged in the UK.

Staff from a private girls’ school alerted police to reports that deepfake images and videos were being circulated by pupils at a nearby private boys’ school, with around a dozen girls thought to be victims.

At the time of writing, investigations are ongoing, but no disciplinary action has been taken by the boys’ school. Despite the spiralling number of incidents – just last week, Children’s Commissioner Dame Rachel de Souza called for a ban on apps that produce deepfakes – my conversations with educators suggest most schools aren’t even aware the technology exists. So, given the prevalence and devastating impact of deepfakes, what is being done about them? Almost nothing.

In most countries, creating and sharing non-consensual deepfake pornography remains legal. In the UK it wasn’t until the Online Safety Act of 2023 that laws were introduced. However, there was a loophole: it only criminalised the sharing, not the creating, of sexually explicit deepfakes.

Earlier this year, added legislation was proposed to make creation an offence too – though this is yet to be formalised – but perpetrators will not face jail unless the image is shared more widely.

It all feels too little too late.

I think of Holly Willoughby, who stepped down from This Morning after a 37-year-old man was jailed for plotting online, with others, to kidnap, rape and murder her. Police found a device at the man’s home containing deepfake pornographic images of her.

I believe if we do not take action to stem the tide of deepfake image abuse, in the coming years we will see more cases of offences like stalking and murder that involve some element of manipulated sexualised images. By making these technologies widely accessible, we give men a powerful delusion of ownership over the bodies of any women they choose, which will exacerbate the already dire levels of male violence against women.

When they happen, public conversations about deepfakes tend to focus on the risks of spreading misinformation, political manipulation, or business impact. These are important issues but research suggests 96 per cent of deepfakes are non-consensual pornography – of which 99 per cent feature women. Yet a Europol report on ‘law enforcement and the challenge of deepfakes’ uses the word ‘women’ once, and has only brief paragraphs on deepfake pornography in its 22 pages. 

Clearly, society perceives the harassment and abuse of women and girls as less of an existential threat than spreading political misinformation.

After all, women and girls have experienced abuse since the beginning of time, right? What difference does a little more make?

Adapted from The New Age of Sexism by Laura Bates (£20, Simon & Schuster) out on May 15. © Laura Bates 2025. To order a copy for £18 (offer valid to 17/05/25; UK P&P free on orders over £25) go to http://www.mailshop.co.uk/books or call 020 3176 2937.

Laura Bates is the founder of Everyday Sexism. 
