“Nudifier” apps that use artificial intelligence (AI) to create explicit deepfakes from ordinary photographs have fuelled an industry valued at up to $36 million (£27 million), a report has claimed.
Big Tech companies such as Google, Cloudflare, Amazon and PayPal have also been providing services to those developing the apps, researchers said.
Nudification apps use AI to create naked images of women, usually without their consent, in a process campaigners have branded “digital sexual assault”. Their use has become widespread, especially among children, and they can be used to blackmail or extort victims.

Baroness Bertin’s government-commissioned review of the pornography industry, along with the Children’s Commissioner and the Internet Watch Foundation, has called for them to be banned.
Researchers from the Indicator website looked at 85 sites offering nudifiers and found that Amazon and Cloudflare provided hosting or content delivery services for 62 of them. Google enabled sign-on services for 53 of the sites, some of which were able to use PayPal, Coinbase, Mercuryo and Telegram for payments either directly or through an intermediary.
The investigation estimated that, based on web traffic, 18 of the sites made between $2.6 million and $18.4 million over the six months to May this year.
Indicator found that search engines were driving a significant share of traffic to the websites. Search terms such as “deepnude AI” and “AI clothes remover” returned either a nudifier or a positive review of a nudifier in the top ten links of Google Search.

One of the most popular nudifier services is called ClothOff. It produces an average of 200,000 pictures a day, according to Der Spiegel. The service, which is run by four people from Russia, Belarus and Ukraine, has a budget of €3 million and 30 employees, a whistleblower told the German newspaper.
The most-visited site investigated by Indicator was Undress CC, which receives about 1.1 million of its 2.8 million monthly unique visitors from referrals by other websites. Undress offers a 40 per cent revenue share with the referrer and claims its affiliates have made more than $2.6 million. The website had an Instagram account dedicated to its affiliate programme that was aimed at recruiting adult performers and content creators. Meta, which owns Instagram, has since removed the account.
Santiago Lakatos, one of the Indicator researchers, called for the tech companies providing support to the nudifiers to “step up to the problem”. He added: “What I’d like them to do is treat this like the problem that it is. It’s not just, ‘Oh, this is porn’ … There’s mountains of evidence about how these are used to target children.”
An Internet Matters report from October last year found that the majority of teenagers (55 per cent) believed it would be worse to have a deepfake nude of themselves created and shared than a real nude image. This was driven by victims’ lack of control over and awareness of the image, and by the fear that family members, teachers or peers might believe it was real.
Jess Asato, the Labour MP who chairs the all-party parliamentary group on perpetrators of domestic abuse, said nudifiers “digitally strip women and girls”. She added: “I think this is a tool that facilitates digital sexual assault and for that reason I have been calling on the government to just ban this content.”

A government source said: “Sharing nude images of someone without their consent is horrific and upsetting and rightly illegal. This government has also made it an offence to create sexually explicit deepfakes. Nudification apps have no legitimate purpose and outlawing them is the next step in tackling violence against women and girls online.”
Google said: “To use sign-in with Google, developers must agree to our terms of service, which prohibit illegal content as well as content which harasses others. Some of these sites violate our terms and our teams are taking action to address these violations, as well as working on longer-term solutions.
“While search engines allow people to access sites that are available on the web, we’ve launched and continue to develop ranking protections that limit the visibility of harmful, non-consensual explicit content by promoting high-quality information when available.”
A spokesman for Amazon Web Services (AWS) said: “AWS has clear terms that require our customers to use our services in compliance with applicable laws. When we receive reports of potential violations of our terms, we act quickly to review and take steps to disable prohibited content. If anyone suspects that AWS resources are being used for abusive activity, they can report it to AWS Trust & Safety using the report abuse form.”
The other companies mentioned were approached for comment.






