- AI models are appearing on adult-content sites like OnlyFans and Fanvue — sometimes with stolen images.
- Some people are selling courses for $220 on how to make your own lucrative AI adult creator.
- Does AI harm adult creators? And do subscribers even know they're talking to a computer?
Last winter, there were a few news items about how AI might be replacing humans in a surprising job: online influencer. The articles said a crop of new Instagram influencers had amassed large followings and even secured brand deals. There was one catch: The influencers were AI.
Some of these AI influencers, like Lil Miquela, are a sort of artsy commentary on the nature of influencing, or at least something conceptually interesting. But when I looked a little further into one of the AI-generated influencer accounts on Instagram — one that had reportedly gotten some brand deals — I found a different type of story.
One of the most popular AI influencers had a link in her bio to a profile on Fanvue, an OnlyFans competitor. On her Fanvue account, the influencer posted provocative photos — and for a $7-a-month subscription, I could see her nude photos. (I feel strange saying "she" and "nude" because this person doesn't exist. Remember: She's AI. But this is where we are in 2024, I suppose.)
Ah, so I get it now: The business was always pornography — Instagram and other social media were just at the top of the conversion funnel. These accounts weren’t trying to become “Instagram influencers” who made money through promoting shampoo — they were using Instagram to drive traffic to Fanvue, where they could get men to pay to see their nude photos.
Once potential customers get to the paysites, they encounter more AI-generated pictures and videos.
The tech news site 404 Media just published a deep dive into this world, "Inside the Booming 'AI Pimping' Industry." Reporters found an astounding number of AI-fueled accounts on both OnlyFans and Fanvue. Disturbingly, 404 Media found that a number of these accounts used images that weren't purely dreamed up by AI. Some were deepfakes — fake images of real people — or face swaps, which put someone's face on an AI-generated body.
There is also a whole side economy of people selling guides and courses on how others can set up their own businesses to create AI models. One person is selling a course for $220 on how to make money with AI adult influencers.
A Fanvue spokesperson told Business Insider that using images that steal someone’s identity is against its rules. Fanvue also uses a third-party moderation tool and has human moderators. The spokesperson also said that “deepfakes are an industry challenge.” OnlyFans’ terms of service prohibit models from using AI chatbots. Those terms also say that AI content is allowed only if users can tell it’s AI and only if that content features the verified creator — not someone else.
Potentially stolen images aside, the existence of AI adult content is somewhat fraught. On one hand, some of these AI creators claim it's not unlike cartoon pornography. On the other, real-life adult-content creators have concerns about AI affecting their business. Some told Business Insider's Marta Biino recently that they find AI tools useful — like AI chatbots they use to talk to fans. But they said they also worried that using AI could erode fans' trust.
I’m not sure that the fans of the AI accounts are always aware that these “people” are artificial intelligence. Comments on one obviously AI-generated woman’s account read like a lot of people think she’s human. On her Fanvue, the AI-generated woman sometimes posts pink-haired anime cartoon versions of herself.
On one of these posts, a paying Fanvue customer wrote that he wanted to see the outfit on the real woman — not an anime version. I'm not sure he knows that neither one is real.