These men allegedly profit from teaching people how to produce AI-powered porn


A little over a year ago, MG was living a relatively normal life in her 20s in Scottsdale, Arizona. She worked as a personal assistant and supplemented her income by waiting tables on weekends. Like most women her age, she had an Instagram account, where she would occasionally post stories and photos of herself drinking matcha tea, lounging by the pool with her friends, or going to Pilates.

“I never cared about coming out and becoming famous on social media,” says MG (who is identified only as MG in the suit to protect her identity). “I used it the way most people used it when it first came out, to share their lives with the people closest to them.” She has just over 9,000 followers, a sizable audience but nowhere near a huge platform.

Last summer, she received a direct message from one of her followers. The person asked her: Did you know that photos and videos of a woman who looks exactly like you are being circulated on Instagram? MG clicked on the link and saw several reels of what appeared to be her face superimposed on a body that looked exactly like hers. The woman in the photos was semi-nude, with tattoos in the same places as MG’s.

MG was horrified. “If you didn’t know me well, you might think they were pictures of me,” she said. “It was like this reality check, where I have no control over my image.”

She was even more dismayed to discover that not only had manipulated nude and semi-nude photos of her been circulating online, as she explained in a recent complaint, but they had also been used to advertise AI ModelForge, a platform that teaches men how to create AI influencers. In a series of online lessons and tutorials, the men allegedly taught participants to use software called CreatorCore to train artificial intelligence models on photos of unsuspecting young women, and to post the resulting content on Instagram and TikTok.

“They provided a full manual, including instructions on how to choose the right person, someone who couldn’t defend herself, so they all had instructions on what type of women to use and where to get their pictures,” she claims. “It was disgusting on every level.”

MG is one of three plaintiffs in a lawsuit filed in January in Arizona against three Phoenix men, Jackson Webb, Lucas Webb, and Beau Schultz, as well as 50 John Does. The lawsuit alleges that the Webbs and Schultz scoured the internet for photos of unsuspecting young women, used artificial intelligence to create photos and videos of fictional models who looked exactly like them, and sold that content on the subscription platform Fanvue.

The suit also alleges that, for $24.95 a month on the Whop platform, the men sold online courses training other men, including the John Does named in the suit, to create their own AI influencers based on photos of real women. The men allegedly created “blueprints” for extracting images from women’s social media accounts and feeding them into a generative AI model on CreatorCore, as well as a separate app that would remove women’s clothing and generate sexually explicit images and videos. The suit alleges that such content has drawn millions of views and brought in more than $50,000 in income in a single month. (The Webbs and Schultz did not respond to requests for comment.)

The complaint alleges that this money-making scheme exploits “a harem of indistinguishable AI clones of unsuspecting women and girls,” and directs “predators seeking to prey” on women on social media. According to the lawsuit, in 2025 the CreatorCore platform had more than 8,000 subscribers creating AI content, resulting in more than 500,000 photos and videos.
