I’ve been seeing a lot of TikToks recently about AI-powered systems such as Stable Diffusion, a text-to-image model that translates written prompts into pictures. These pictures have gone on to be used by brands to create concept images and even full-blown marketing campaigns. It’s pretty wild stuff. I had a bash myself and typed in “horse drinking milk through a straw”, and it threw out some pretty abstract imagery.
However, not all users explore technology in the same way, and some have been testing the systems’ murkier side. In this case: porn. AI porn is highly flawed and more than a little unsettling, and as the technology advances it will raise extremely tough ethical issues for sex workers. Last week, what appears to be one of the first high-fidelity AI porn sites, called Porn Pen, went live.
Find out more on the subject of AI porn in this TechCrunch article.
Users can customise the AI nude models using tags like “lingerie model” or “chubby.” You can also change the AI model’s ethnicity and choose the backdrop she’s standing against. There are currently no male AI models to customise; it’s women only. Buttons generate pictures of the model from the front or rear, and users can also pick a “film photo” or “mirror selfie” option.
A user claiming to be Porn Pen’s creator posted on Hacker News, saying the site is just an “experiment” that makes use of cutting-edge text-to-image models. They stated:
“I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated. New tags will be added once the prompt-engineering algorithm is fine-tuned further.”
Experiment or not, Porn Pen raises a number of ethical issues. Could this be the demise of content creators? Ashley, a sex worker, doesn’t think the content generated by Porn Pen is a threat to sex workers at the moment. She says:
“There is endless media out there, but people differentiate themselves not by just making the best media, but also by being an accessible, interesting person. It’s going to be a long time before AI can replace that.”
Porn sites such as OnlyFans and ManyVids require content creators to verify their age, but AI-generated porn models obviously can’t do this. Ashley’s main concern is that if porn sites crack down on AI porn, it could lead to harsher restrictions for sex workers and content creators. If you’re interested in creating AI-assisted porn yourself, check out createporn for more info.
Stable Diffusion is one example of a system that “learns” to create images from text. After being fed billions of captioned images, these systems learn that particular words and phrases correspond to particular art styles, aesthetics, locations, and so on. Say you type in “black dog on balcony”: the system will produce an arty black dog on some sort of balcony. Things get trickier when prompts lean on stereotypes or are particularly vague. The main issue some people have is that these systems will only “represent those whose bodies are accepted and valued in mainstream society”.
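To make that a little more concrete, here is a minimal sketch of turning a text prompt into an image with the open-source diffusers library. It is not Porn Pen’s actual code; the checkpoint name and settings are illustrative assumptions.

```python
# A minimal sketch (not Porn Pen's code): generate an image from a text
# prompt using the open-source `diffusers` library. The checkpoint name
# and settings below are illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

# Load a publicly available Stable Diffusion checkpoint.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # use "cpu" (and drop the float16 dtype) if no GPU

# The prompt plays the same role as the tags described above.
image = pipe("black dog on balcony").images[0]
image.save("black_dog_on_balcony.png")
```

A site like Porn Pen presumably wraps something similar behind its tag buttons, assembling a fixed prompt from the selected tags rather than accepting free text, which matches the creator’s comment about removing custom text input.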
This leads us to another issue: deepfakes. We’ve all seen those faked nude images of celebs in compromising positions. While most are almost comical for how ridiculous the photoshopping is, there are some where it is almost impossible to tell the difference. Sensity AI, a research firm, found in 2019 that 96% of deepfake videos online were non-consensual porn. It’s not just celebrities who have had faked images made of them; porn stars and OnlyFans models have also fallen victim to this kind of violation.
Theoretically, a porn star might be able to take legal action against the creator of a deepfaked image using copyright law, defamation claims, and even human rights law. However, compiling the proof to back up such a claim can be a huge burden. A leading researcher had this to say as a closing statement:
“I think we’ll see a lot more people testing the limits of both the technology and society’s boundaries in the coming decade. We must accept some responsibility for this and work to educate people about the ramifications of what they are doing.”