A MAGA-aligned social media model who had built a sizable following among conservative audiences in the US has been exposed as an entirely AI-generated personality—created by a 22-year-old man in India looking to earn money online, according to a report by WIRED.
The creator, who went by the pseudonym Sam to avoid jeopardising his future medical career, relied on generative AI tools to construct every aspect of the persona. From the woman’s appearance to her posts and captions, everything about “Emily Hart” was fabricated. The account portrayed her as a blonde Donald Trump supporter with strong Christian views, and even claimed she worked as a nurse. Visually, she resembled actor Jennifer Lawrence.
Sam, an aspiring orthopaedic surgeon, crafted the character to fit a very specific mould—posting images of her in bikinis, engaging in activities like ice fishing, drinking beer, and even posing with firearms. He said the idea began when he was struggling financially during his studies and trying to save money to eventually move to the US.
Initially, his goal was simple: make money online. He began by generating AI images of a young woman in a bikini and then turned to Google’s Gemini AI for advice on growing the account. The chatbot suggested focusing on a niche audience—particularly conservative, MAGA-leaning users, especially older men in the US, who tend to be more loyal and have more money to spend.
A man in India says he made thousands of dollars scamming ‘super dumb’ MAGA fans with a fake AI influencer named ‘Emily Hart’
• Helped pay for his medical school
• Instagram’s algorithm would blow up his fake content daily
• Tried a liberal version but it didn’t work… pic.twitter.com/u4ySJJXfgR
— Culture Crave 🍿 (@CultureCrave) April 21, 2026
Taking that advice, Sam tailored the content accordingly. The posts leaned heavily into themes like Christianity, gun rights, anti-abortion messaging, and anti-immigration rhetoric. One such post showed the character holding a gun with the caption: “If you want a reason to unfollow: Christ is king, abortion is murder, and all illegals must be deported” along with “POV: You were assigned intelligent at birth, but you identify as liberal.”
The strategy worked quickly. “Every reel I posted was getting 3 million views, 5 million views, 10 million views. The algorithm loved it,” he told WIRED. Within a month, Emily Hart had more than 10,000 Instagram followers, many of whom also subscribed to her softcore AI-generated content on Fanvue, an OnlyFans competitor.
Sam said the effort required was minimal compared to the returns. “I was spending maybe 30 to 50 minutes of my day, and I was making good money for a medical student. In India, even in professional jobs, you can’t make this amount of money. I haven’t seen any easier way to make money online,” he said, adding that he was earning a few thousand dollars each month.
He also used X’s Grok AI to produce more explicit images, which he uploaded to Fanvue—a platform where users pay for exclusive content and interaction with creators. “I was basically doing nothing. And it was just flooded with money,” he said.
The operation eventually came to an end when Hart’s Instagram account was taken down in February for fraudulent activity. After WIRED published its investigation, her Facebook profile was also removed.
Looking back, Sam said he likely would have stopped anyway, even without the ban, and maintains he doesn’t regret creating the persona. “I don’t feel like I was scamming people,” he said, arguing that subscribers were satisfied with the content. He has since moved on from creating AI influencers and plans to focus on his medical studies instead.
DISCLAIMER: This story highlights the rise of AI-generated personas on social media. While the content involves viral trends and financial themes, please note that these claims and strategies originate from unverified social media narratives and have not been independently verified. This article is for informational purposes and does not constitute professional career or financial advice.