Can we model almost all of the money choices in our lives as ethical offsetting problems?

Example 1: You do not give money to a homeless person on the street, or to a friend who's struggling financially and maybe doesn't show the best sense when it comes to money management. You give the money you save to a homeless shelter or to politicians promoting basic income or housing programs.

Example 2: You buy cheaper clothes from a company that probably treats its workers worse than other companies do. You give the money you save to an organization that promotes ethical global supply chains or gives direct cash aid to people in poverty.

(Note: In all these examples, you might choose to give the money to an organization that you believe has a larger net positive impact than the direct offset organization. So you might not give money to homeless people, and instead give it to the Against Malaria Foundation, etc. This is a modification of the offsetting problem that ignores questions about the fungibility of well-being among possible beneficiaries.)

The argument for: In the long term, you might promote systems that prevent these problems from happening in the first place.

The argument against: For Example 1, social cohesion. You might suck as a friend, get a reputation for sucking as a friend, and feel less safe in your community, knowing that if everyone acted the way you do, you wouldn't get support. For Example 2, the market mechanism might just work better: maybe you should vote directly with your money? It's fuzzy, though, since paying less to companies that already pay horribly may just drive pay down further. Some studies on this would be helpful.

Critical caveat: Are you actually redirecting the money you save by doing the probably-negative thing into the more-probably-positive thing? It's very easy to do the bad thing, say you're going to do the good thing, and then forget the good thing or otherwise rationalize it away.

I think the critical caveat dominates here. Most of the time, the choice isn't "give to a homeless street beggar vs. donate the additional money to a shelter", but "give now vs. some undefined use of the money later, probably less charitable than you'd like to think".


How long will it take until high-fidelity, AI-generated porn becomes an effective substitute for person-generated porn?

Here are some important factors: Is it ethical? Is it legal? Does the output look genuine? Is it cost-effective?

Possible benefits:

  • More Privacy. If significant markets still exist for porn images, the images taken of porn actors can be used as training data rather than published as-is, which means their identities can be shielded from consumers.
  • More Boutique Offerings. If massive volumes of fairly derivative AI-generated pornography can be created basically for free, this may also drive demand for highly produced, well-compensated, commissioned pornography. Think handcrafted Etsy goods in the current age of globalized mass production, or DeviantArt commissioned hentai drawings, or OnlyFans.
  • More Variety, Lower Cost. From a consumer perspective, AI generation opens up infinite horizons and lowers the barrier to entry.
  • Less Human Trafficking. If the market splits into mass-produced AI-generated porn and boutique offerings from a select number of actors, this may reduce demand for people to do run-of-the-mill porn shoots.

Problems to look out for:

  • Illegal Material. Large crawls with little oversight will almost certainly pick up images of child pornography, violence, etc. These will need to be filtered out of the training data.
  • Copyright. These models will use tons of source images. How do you work out copyright and payment for use? This problem is similar to what GitHub's Copilot is going through right now.
  • Less Compensation. With more competition from AI generation, many porn actors who aren't popular enough for dedicated followings may be compensated less for their work.
  • More Human Trafficking. Or maybe the demand for training images becomes so high, while the compensation for producing them is pushed so low that mainstream porn actors won't take the work, that the gap ends up being filled by human trafficking?

A really unpleasant case:

  • Illegal Porn Without Illegal Training Data. What if a model can generate child porn, snuff porn, or other horrific material without using the corresponding material as training data? The analogue here is that drawn or 3D-modeled hentai depicting this kind of material is not considered illegal in many countries; will this hold for photorealistic content?

Why is the really unpleasant case really unpleasant? While I'm not particularly interested in seeking out such material, I'm not averse to the possibility of its mere existence.

It might be unpleasant if it were forced in my face in the form of ads, though.

There could be knock-on effects that increase demand for non-AI-generated analogues, increasing harm.

(Edited for formatting.) There could also be the opposite effect: demand for non-AI-generated analogues could decrease because potential consumers of this kind of content are satisfied with the virtual, AI-generated, no-one-was-harmed analogues, hence reducing harm. I can see how sex with real children leads to moral condemnation and legal punishment. But if no real child is ever involved, it seems to me that this is an instance of "disgust leads to moral condemnation, which leads to legal punishment / prohibition of the material".

You can make text bold by surrounding it with two asterisks (**) on each side, at least in the Markdown editor: for example, **bold text**. With the rich-text editor, you can just click the bold button.

Fixed. Thank you.