They’re Selling Nudes of Imaginary Women on Reddit — and It’s Working
“F/19 feeling pretty today,” Claudia’s post reads. She’s got straight black bangs and giant blue-green eyes, with just the socially appropriate amount of cleavage sticking out of her grey tank top. With her alabaster skin, delicate features, and vaguely indie hairstyle, she looks exactly like someone the average Redditor would obsess over, and indeed, the comments on Claudia’s post on the subreddit r/faces are all variations of, “hot” and “you’re very gorgeous.” Except for one.
“For those who aren’t aware, I’m going to kill your fantasy,” the comment reads. “This is literally an AI creation, if you’ve worked with AI image models and making your own long enough, you can 10000% tell. Sorry to ruin the surprise, I guess.”
Claudia is, indeed, an AI-generated creation, who has posted her (AI-generated) lewd photos on other subreddits, including r/normalnudes and r/amihot. She's the brainchild of two computer science students who tell Rolling Stone they essentially made up the account as a joke, after coming across a post on Reddit from a guy who made $500 catfishing users with photos of real women. They made about $100 selling her nudes until other Redditors called out the account, though they continue to post her lewds on other subreddits.
“You could say this whole account is just a test to see if you can fool people with AI pictures,” says the team behind Claudia, who declined to disclose their real names. “You could compare it to the vtubers, they create their own characters and play as an entirely different person. We honestly didn’t think it would get this much traction.”
Claudia was created with Stable Diffusion, an open-source text-to-image model that generates shockingly realistic-looking photos from nothing but a text prompt. (In this case, the prompt requested a selfie of a woman in her house, "without makeup with black hair, shoulder length hair, simple background, straight hair, hair bangs.") Her post on r/faces prompted a firestorm of users reporting the post, leading a moderator for the group, who asked not to be named, to post a disclaimer clarifying that AI-generated photos are not against the subreddit's rules. "I take a caveat emptor approach with these things," the moderator tells Rolling Stone. "If people think it is real and want to do something outside of my subreddit, that is on them."
Claudia is among the first, but by no means the last, fictional adult content creators to be generated via rapidly evolving AI technology, prompting a slew of ethical questions and concerns. Most discussions about the dangers posed by AI and adult content have focused on the prevalence of deepfakes, a term for images or videos that use a person's face without their consent. According to Sensity, an AI firm, 96 percent of all deepfakes are pornographic in nature and feature women's faces used without their consent. Though many platforms, like Reddit, ostensibly have policies preventing the proliferation of deepfakes, such content is fairly easy to find online, with some Discord communities selling deepfake porn of "personal girls" — meaning non-celebrities — for as little as $5 a pop, according to an NBC News report.
But while the ethical implications of deepfake porn are fairly unambiguous, the potential threat posed by AI porn featuring fictional characters is different, says Hany Farid, a professor of computer science at University of California, Berkeley. “I think deepfake porn should be illegal and is really problematic. That is black and white and easy in some ways,” he says. “On this stuff, it’s more complicated.”
Even though adult-content creations like Claudia are ostensibly fictional and made up out of whole cloth, the billions of images used to train AI models include photos of real people, most of whom have almost certainly not consented to having their likenesses used in this way. This, in itself, raises ethical issues, says Farid, as there is a possibility (albeit a very small one) that these AI models can reproduce images they were trained on. "I don't think it's likely, and I don't think it happens often, but it can happen that someone can have a Facebook image, and their likeness ends up in nonconsensual imagery," he says. "And the fact that it can happen, is important."
There's also the fact that even though a handful of technologically savvy Redditors were able to identify Claudia as an AI creation, she wasn't explicitly labeled as such, which could create problems for consumers. The creators behind Claudia say the onus should be on the consumer to discern whether a model is real or synthetic: "We don't really know how to think about this, the person buying the pictures is happy with their purchase since they are looking for porn," they say. "We never really said those are pictures of a real person, and buyers never really asked. They just assumed."
Authentication could also pose a problem for platforms, which rely on often elaborate verification processes to ensure that creators who join are not underage or being exploited (which is why accounts like Claudia's are selling nudes in unregulated Reddit DMs, and not on sites with more stringent verification policies). While some human creators already on OnlyFans use AI to enhance their content in unique ways — such as HarperTheFox, who uses the technology to generate, say, cyberpunk-inspired nudes — allowing non-human creators on the platform could create major issues, says Ashley, a sex-worker peer organizer who works on tech platform issues. "The idea behind all this verification is making sure no one is being exploited. There's no person here, so no one can confirm consent," she says, adding that platforms like OnlyFans would likely not allow AI creators for that reason. (OnlyFans did not immediately return a request for comment.)
Perhaps more to the point, the question of whether AI adult creators could one day supplant real ones fundamentally misunderstands the appeal of websites like OnlyFans, which are predicated on the idea that a subscriber is directly interacting with a model, says Ashley. In the short term, she says, the technology is simply not developed enough to pose a real threat to the adult marketplace: "The capabilities are still overblown. I haven't seen anything to indicate that it could generate a video that looks like a person having sex."
In the long term, however, though Ashley could see a market for AI-generated erotic art of, say, beloved cartoon characters, she thinks there will always be a demand for content that shows real people having sex, simply because authenticity can be a powerful selling point for all types of material goods, not just porn. She envisions a world in which flesh-and-blood adult content is valued at an even higher premium, sort of like fair-trade organic coffee or handmade knitwear.
“It’s not uncommon for things to be branded as bespoke, handmade, with stuff that’s competing with fast fashion and fast food,” she says. “People do pay a premium for hand crafted content of any sort. That could be a marketing point people connect with.”
Those who create AI porn, however, frame the future popularity of models like Claudia not as a hypothetical, but as an inevitability. “AI porn is going to only get more popular,” the computer scientists who made Claudia said in their last DM to Rolling Stone. “There will be more accounts like this.”
Update April 11, 2023, 12:04 p.m.: This post has been updated to include comment from the moderator for r/faces.