We talk a lot about how the internet is filling up with content generated by artificial intelligence. And, of course, that includes the sort guaranteed to generate clicks and dollars: the adult variety. Platforms like Instagram have seen an explosion in sexy AI-generated influencers, and the people running those accounts sometimes steal content from real creators and mash it up with AI.
The practice is called “AI pimping,” Jason Koebler, co-founder of 404 Media, told Marketplace’s Meghan McCarty Carino. The following is an edited transcript of their conversation.
Jason Koebler: There was this AI influencer named Emily Pellegrini, and a few months ago the person behind the Emily Pellegrini account pivoted that entire account to be about how to make an AI influencer. And so he sells a class, under the name Professor EP, for something like $250. And you have all these people talking about strategies for creating these influencers, how to do it, how to monetize it. And almost all of the AI influencers that we saw are female personas, and many of the people operating these accounts are men. And we know that just because they told us that that's the case.
Meghan McCarty Carino: So it sounds like a whole kind of cottage industry has grown up to do this. How do these AI pimps actually make money from this?
Koebler: Almost all of them use sites that are not OnlyFans, but are OnlyFans competitors. There are a few different websites that more or less allow you to subscribe to a creator and pay them a monthly fee or to buy imagery directly from them. And so essentially, they're trying to build a really big following on Instagram, and then they push followers to these subscription sites, where you can subscribe to their content, and very often it will be sexual content on these pay-subscription sites. And some of these guides even say you're not necessarily in the pornography business, you are in the, quote, unquote, “loneliness business,” where you're essentially selling a fantasy to people that you are a real person. And there is a person behind it, but there are also all these strategies for, like, how to talk to potential customers, scripts to use. A lot of them also have bots that automatically talk to potential customers. And so almost everything that you can possibly imagine is automated to the extent possible.
McCarty Carino: How is this affecting actual human adult content creators?
Koebler: Yeah, I mean, first and foremost, I think there's the dehumanization aspect of it. We've spoken to some people about this, and it's really alarming to see a video of yourself with a different AI-generated face on it. We know that Instagram's algorithm has, over the last few months, heavily promoted Reels, so it's very popular for people to take Reels that have previously gone viral, from either adult actresses or just standard influencers, and to swap AI-generated faces onto those videos. And that's really alarming. The other thing is that Meta has not been terribly welcoming to adult performers. It has banned a lot of these accounts, so they sort of exist in this state of precarity.
And so we talked to one adult performer named Elena St. James, and she told us that she wants to go and report this stuff to Meta, but by doing that she sort of invites scrutiny to her own account, and so she’s worried about doing that. And so a lot of the people that we spoke to said that they’re worried about reporting this stuff to Meta. And I have talked to Meta. Meta commented for this story and sort of said, “Well, we have rules against impersonation.” But often it’s not sort of clear who they are impersonating, because the AI-generated persona is usually not a person that exists. And it can be the case that one AI influencer might be face swapping their face onto the bodies of five or six different women, and so it’s not even that the body is consistent across time. It’s really dystopian in that way.
McCarty Carino: So are platforms doing anything to mitigate this kind of activity?
Koebler: That's a great question. My personal feeling is that Meta is really dropping the ball here. Some of these accounts are verified by Instagram. Many of them have tens of thousands or hundreds of thousands of followers. Many of them are ripping off the content of other verified Instagram models and influencers. And I think that if Meta wants to have a platform that feels healthy and feels human, it needs to be a lot stricter about what it is allowing in this space, particularly allowing AI to impersonate humans and allowing AI influencers to sort of take over.
We have seen a few accounts that are really egregious that have been taken down, but we spoke to one researcher who has been tracking these accounts, and he found 900 of them. And then he said, “I stopped counting, because I could have easily found 900 more in another day.” And that’s been our experience as well. We’ve been reporting on the story for months, and at some point we had a spreadsheet of accounts and we stopped keeping track because there were so many. And like anything else on social media, it’s like once you interact with one or follow one, or scroll on one, it’s just like you’re recommended 10 more. And so it really is a rabbit hole here.
McCarty Carino: You write that the implications of this phenomenon go beyond the adult content realm. They might kind of give us a preview of what the whole internet is going to look like. Tell me more about what you mean by that.
Koebler: I would classify this into the broader category of AI slop, which is not a term I created, but is sort of just like low-effort content that has flooded every corner of the internet. It’s like, pick a platform and there’s some version of this. I’ve been writing extensively about AI-generated photos that are not influencers, but just random photos on Facebook that have gone viral over and over and over again. And a lot of these platforms are kind of in a tricky position, because many of them are investing in generative AI technology. Like, Meta has its own AI chatbots and its own AI-image generation services, and so they’re very reticent to take a hard-line stance against AI-generated content.
And it's so easy to make so much of this. One of the specific appeals in these guides says, “a human needs to travel to Bali to take a photo, like, showing a lifestyle of, ‘Hey, I’m in Bali,’ but an AI influencer can make 1,000 different photos of themselves in Bali or in some tropical destination in a few seconds.” In terms of sheer scale, human beings really just can't compete. And with the algorithmification of all of our feeds, what the people who are spamming this stuff are doing is finding weaknesses in the algorithm. They're finding things that work, stuffing keywords and hashtags into their posts, getting this stuff boosted in the algorithm, and collecting either small advertising revenue or trying to get people to subscribe to their thing. And then when Meta or another platform cracks down on it, they just pivot to another type of AI-generated content. And so I do think it's taking over every platform. I think discoverability on the internet is becoming very difficult, and it's getting really hard to sort of wade through the AI-generated muck, for lack of a better term.