Hacker News
OpenAI Is 'Exploring' How to Responsibly Generate AI Porn (wired.com)
16 points by belter on May 9, 2024 | hide | past | favorite | 30 comments


> may include erotica, extreme gore, slurs, and unsolicited profanity

Why on Earth is porn grouped in with gore and slurs?!

What is with America's bizarre pretence that sex is something dirty? So dirty, in fact, that it's in a category with swearing at people and slurs? And I say "pretence", because _my god_ you produce a lot of porn for a country that's supposedly against that sort of thing.


Isn't the context of what you're quoting to do with the label "NSFW"? That label certainly has issues, but to be fair it's more of a "I don't want this on my screen at work" thing than a "this is dirty" thing. And from what I understand, that attitude isn't limited to America.


> > may include erotica, extreme gore, slurs, and unsolicited profanity

> > Why on Earth is porn grouped in with gore and slurs?!

Because some fetishes involve one or more?


I'm really surprised they'd even say the 'p' word. It's truly a massive can of worms. Seems smarter to just nuke it completely and hope someone else tackles that mess.

Incorrectly handled, it could really topple the OpenAI empire.


When the Christian right kept losing court cases over various laws based on Christian moral values (examples: censorship laws and religious content in public school education), they figured out that they could seek to influence every aspect of US society rather than fight court battles trying to make everything they don't like illegal. So they took up the "Seven Mountain Mandate." That includes banking and credit card processing.

Credit card companies and banks adopting Christian values into policy are why we see all the moralistic rules in the app stores, and why numerous websites have had to strictly curtail or outright remove 'adult' content.

It's also why porn actors and others in the industry are "unbankable" - many banks will close the accounts of someone they find out is active in the 'adult entertainment industry'.


> Credit card companies and banks adopting Christian values into policy are why we see all the moralistic rules in the app stores, and why numerous websites have had to strictly curtail or outright remove 'adult' content.

My understanding is that the proximate cause is that such services are more vulnerable to chargebacks and refunds, etc. than other industries.

My current speculation is that the low moral status with which the industry is viewed, as well as the lack of existing protections, makes its patrons more easily rationalize not paying, or being bad customers, when engaging such services.


It reminds me of rumors: unverified information heard or received from another. How is this any different? Videos, photos, and sounds need to be verified, too. I remember when you could look at a photo and tell with 99% certainty that it was legitimate, but regardless, deepfake videos do not have to be believed, any more than hearsay has to be trusted. We might just start defaulting to disbelief.


Given that porn is a fantasy anyway, I’m not really sure it matters much. People will consume a fantasy.


Make the generation realtime VR and Meta Quest might have its killer app!


This is the killer app elephant in the room. Removing the exploitation of people in porn is a net positive. There have been attempts at ethical porn, but it is a garden in a rainforest.


Funny thing that you mention exploitation. From a high-level perspective, there's a lot of parallels in the discourse surrounding exploitation of women in pornography and exploitation of artists in generative AI. Mind you, I said that the discourse has parallels, we talk about these two things in similar ways -- I'll refrain from speaking on behalf of the actual actors whose experiences are distinct from my own (e.g.: artists, women, neural networks).

It's a pithy thought, but exactly what do I mean by that? Well... both things fundamentally exist within the interplay between labor, content, and the internet. We're forced to grapple with the yawning gap between abstract and economic value which the fundamental reproducibility of internet content implicitly creates, not to mention the resulting potential for exploitation. This conflict forces us to reflect on unpleasant worldly realities -- how responsible is a consumer with finite resources for the fair treatment of those whose labors produce an infinite commodity?


Given that the AI in question is trained on the products of human exploitation (as you put it), is it accurate to say it’s been removed? The exploitation is laundered here and not truly removed, to my eyes.

(Intuitively it’s even worse in some ways, since performing in adult media doesn’t necessarily imply that the performer wants their physical attributes merged and re-represented in ways they would not have consented to.)


Yes, because that person is no longer physically at risk. Whether they consent to their image being used is a different type of exploitation that needs to be addressed. Both can be true. As I put it is correct.


> Because that person is no longer physically at risk.

I don't know if I would describe someone whose image is subject to arbitrary manipulation per anybody's fantasies as "no longer physically at risk." There's a nontrivial amount of precedent for mentally unwell people building up parasocial relationships that they then (violently) act on; it's not inconceivable that AI would facilitate those.

But even beyond that: you're approaching this from a baseline assumption that everyone who makes adult content is "physically at risk," which probably isn't true. The reality is that it's a spectrum, and that there is probably a large demographic that isn't harmed by the content they voluntarily produce, but would be harmed by content produced from a model trained on their data.

As an intuitive example: someone who does boudoir photography or burlesque is clearly not inviting an AI to produce "harder" content using their likeness, and would be harmed by that.


I agree with you (I also think gaming has a lot of potential with this stuff), but personally I wouldn't want to feed my preferences for this sort of content into a hosted service.

I'd also imagine that there needs to be some consideration applied in a legal context. Just yesterday there was a massive child porn bust which also included AI-generated content[0].

[0] https://toronto.ctvnews.ca/ontario-provincial-police-arrest-...


Why is "exploitation of people" in porn unethical? Are porn actors forced to work?


In theory? Nothing. In practice? A few things coalesce to make the situation more complicated:

1. Sex is a fundamentally asymmetrical power dynamic. For a conventional porn actress, it's frequently something that's done to you. These sex workers become passive actors in a transaction that directly taxes their mind & body, something very rare in other industries.

2. Sex work comes with a hefty opportunity cost. Any reputation you create for yourself in the sex industry becomes toxic once you leave -- and eventually you do have to leave it. It's a job that consumes all of your prime working & education years in exchange for little, if any, future opportunity.

3. Sex work is often, by far, the most lucrative job available to a young woman. Some young women simply have no other realistic choice to pay the bills. This in and of itself isn't a problem unique to sex work, however it does uniquely suck for sex workers because of the prior two issues. How do you stand up for yourself when the other party knows you can't walk away? How do you save for the future if you need the money today?


Yes. Eastern Europe and Southeast Asia including India. (And America.) Basically everywhere. Funny you think that all porn is consensual. Not “funny,” I meant “sad.”


Please, make me think otherwise, with numbers and proof.


No. I don't waste time on people like you who really don't care and just want to fight.


Might be referring to nonconsensual cases, like "dumb high school kids make AI porn of their classmates"


Is it a common case? It seems like the vast majority of porn is made by professional actors, isn't it?


There are tons of stories of people being forced into acts they did not sign up for, and plenty of unsavory or illegal practices in the industry. Is all porn unethical? No, of course not, but there is still enough that you have a very serious chance of stumbling upon "not-ethically sourced" porn on a regular basis.


They're not going to touch that one.


Well, that's exactly what I mean. They should make sure their tools cannot be used to do that!


Sounds like Microsoft have decided on their direction for the next Xbox.


Why does it need to be “responsible”? I feel these words are intentionally misleading about what they are - morality codes based on a small number of people’s opinions.

I can’t read the linked article (paywall) but others on the same topic say that OpenAI would like to explore supporting whatever is not illegal, but that deepfakes are out of the question. Are deepfakes illegal and should they really be? Deepfakes are no different from what goes on in people’s heads. Should the law really restrict that expression? I can see restricting passing off deepfakes as real to embarrass someone, but we already have laws for things like defamation. And if it isn’t illegal but just expression, should OpenAI support it?


> morality codes based on a small number of people’s opinions.

In this case specifically, the "small number of people" happen to be "the people who work at OpenAI".


It’s not all the people at OpenAI. It’s a few people on a “safety” (euphemism) team in turn responding to outside pressure from a few loud people.


> It’s a few people on a “safety” (euphemism) team in turn responding to outside pressure from a few loud people.

It's weird that you attribute this to outside influence rather than thinking someone at OpenAI might care how their software is used.



