A video appeared on Instagram a few weeks ago: a granddaughter describing how her grandfather spent decades hand-stitching leather bags in a small workshop, his life’s work now available online for a limited time. The imagery was warm, the narration emotional, the backstory complete. It was entirely fabricated.
ABC News identified dozens of similar operations across TikTok and YouTube, each using generative AI to manufacture founders, fake factory footage and synthetic brand narratives to move low-quality imported goods at premium prices.
Generative AI has collapsed what it once took to build consumer trust online. A direct-to-consumer brand used to need a real founder, original photography and operational credibility to justify charging $80 for a candle or $200 for a bag. Scores of companies now use AI to portray themselves as struggling small businesses, generating fake images and videos of craftsmen who don’t exist. That finished product can be assembled in hours.
The Trust Factory
The playbook follows a formula. Some operations use AI to make emotional appeals—one purportedly New York-based clothing retailer shared an AI-generated image of a damaged storefront with shattered glass and police tape to announce a “big sale.” Others simulate artisanship.
What makes these operations work isn’t production quality. It’s timing. ABC News noted that by the time consumers leave reviews or file complaints, the sites often go offline or move on to selling another product. The gap between launch and exposure is the margin.
Social platforms amplify the risk. These fraudulent sites thrive on social media, where consumers are often distracted and more likely to make a quick purchase. The scroll-and-tap dynamic that drives social commerce removes the scrutiny a buyer might apply elsewhere. The FTC reported that Americans lost $2.1 billion to scams originating on social media in 2025, an eightfold increase since 2020. The agency noted that most scams go unreported, meaning the real total is likely higher.
Platforms Caught Between Speed and Safety
The problem has forced marketplaces into a familiar position: moving fast enough to stay competitive while running detection systems capable of catching identities that never existed. Allure Security reported that TikTok rejected more than 1.4 million seller applications, blocked 70 million products before listing and removed roughly 700,000 sellers for policy violations in the first half of 2025. The company’s head of global governance called generative AI a tool for organized fraud networks operating at scale.
The numbers show platforms moving, but not fast enough. PYMNTS Intelligence found that 52% of businesses have deployed new AI models for fraud detection, with retailers using adaptive machine learning to reduce false positives by up to 85% while doubling compromised card detection. Yet only 37% use generative AI for fraud protection, even as 72% expect AI-driven fraud to be their top challenge by 2026.
The Verification Gap
Detection at the content layer is one piece. The harder problem sits at merchant onboarding.
According to Visa, cybercriminals are using generative AI to create synthetic identities, deepfake videos and forged digital documents that bypass traditional verification methods. A fabricated founder with a plausible backstory, a registered domain and polished AI-generated product videos can clear onboarding checks built for a different threat model.
For payment platforms, the question is no longer just whether a transaction is fraudulent. It’s whether the merchant behind it is real. Synthetic identities that blend real and fabricated information let fraudsters slip past verification systems designed around a different threat model.
The FTC received 3 million fraud reports in 2025, with total losses reaching $15.9 billion, up from $12.5 billion the prior year. Impersonation scams ranked as the most reported category. The agency is scheduled to release updated guidance on AI-generated deception later this year.