Development
Women sue the men who used their Instagram feed to create AI porn influencers
May 1, 2026. Source: Ars Technica
“They provided a whole playbook, including instructions on how to pick the right person so that it’s not someone who can defend themselves, so they all had instructions on what type of women to use and where to get their pictures,” she claims. “It was disgusting on every single level.”
MG is one of three plaintiffs in a lawsuit filed in January in Arizona against three Phoenix men: Jackson Webb, Lucas Webb, and Beau Schultz, as well as 50 other John Does. The lawsuit alleges that the Webbs and Schultz scoured the Internet for photos of unsuspecting young women, then used AI to generate photos and videos of fictional models who look exactly like them, selling such content on the subscription platform Fanvue.
The suit further alleges that, for $24.95 a month on the platform Whop, the men sold online courses teaching other men, including the John Does named in the suit, how to make their own AI-generated influencers based on real women’s photos. The men allegedly created “Blueprints” for scraping images from women’s social media accounts and feeding them into the generative AI model on CreatorCore, as well as into a separate app that would remove the women’s clothes and generate sexually explicit images and videos. Such content, the suit claims, drew millions of views, reportedly bringing in more than $50,000 in income in a single month. (The Webbs and Schultz did not respond to requests for comment.)
This moneymaking scheme, the complaint alleges, preyed on a “harem of indistinguishable AI copies of unsuspecting women and girls” while instructing “predators seeking to prey on” women on social media. According to the suit, in 2025 the CreatorCore platform had more than 8,000 subscribers generating their own AI influencers, resulting in more than 500,000 images and videos.
Earlier this year, Kupper introduced a bill in the Arizona Legislature requiring websites to use automated detection tools, such as age verification or consent forms, to prevent nonconsensual AI content from being uploaded. “Once something’s online, it’s pretty much there forever, even though victims spend millions of dollars trying to take it down. It’s like whack-a-mole—you hit one, another one pops up,” Kupper says.
Currently, if you visit the Linktree page for AI ModelForge, it directs you to what appears to be the same business rebranded as “TaviraLabs,” a Telegram group with more than 18,000 members that advertises itself as “the #1 AI Influencer coaching community.” Additionally, the suit names more than a dozen Instagram accounts used by the defendants to promote AI ModelForge, most of which are still active. The suit details how such accounts continue to post photos of nubile women, fast cars, and expensive watches, writing captions such as, “She’s not my girlfriend, she’s my best paid employee” and “POV: You built her in 20 minutes and she made you $13.2k in the first 45 days.”
Even though MG and the other plaintiffs have continually lobbied Instagram to take their images down, many of them are still up, she claims, because they do not technically violate Instagram’s guidelines surrounding AI-generated content. When reached for comment, a spokesperson for Instagram said it had “extremely strict policies” around both AI- and non-AI-generated nonconsensual intimate imagery, removing accounts that post such content. When provided with a list of a dozen or so accounts thought to be associated with AI ModelForge, the spokesperson said the accounts were under review.
The suit also cites a number of TikTok accounts promoting the men’s business. When reached for comment, a TikTok spokesperson said the accounts were found to violate community guidelines and have been taken down.
MG says the images generated by AI ModelForge are distinct enough from her own photos that, frustratingly, she has been unable to claim that the accounts are impersonating her, which is also a violation of Instagram guidelines. “It’s my face, my tattoos, on a different outfit on a slightly different body,” she says. “These are real women being transformed, not just a random AI-generated person.”
Though MG lives in constant fear of people in her life seeing the pornographic AI-generated images of her, she says filing suit has given her a bit of her agency back. “We were put in this place where our backs were against the wall and I want other women to know you can’t stop living your life,” she says.
Still, what happened to MG, a woman with fewer than 10,000 followers, has daunting implications for virtually anyone with a remotely public online presence.
“It’s not about being cautious with your image online because everyone posts on social media now,” she says. “Everyone is on LinkedIn. Everyone is on Instagram. And I want people to realize that this could also happen to them.”
This story originally appeared on wired.com.