MugaSofer comments on Open thread, August 19-25, 2013 - Less Wrong Discussion

2 Post author: David_Gerard 19 August 2013 06:58AM

Comment author: MugaSofer 23 August 2013 03:57:07PM *  -1 points [-]

It is clear that there are some configurations of matter I don't care about at all (like a paperclip), while I do care about other configurations (like twelve-year-old human children), so it is elementary that at some point my utility function must go from 0 to nonzero.

And ... it isn't clear that there are some configurations you care about ... a bit? Sparrows being tortured and so on? You don't care more about dogs than insects, and more about chimpanzees than dogs?

(I mean, most cultures have a Great Chain Of Being or whatever, so surely I haven't gone dreadfully awry in my introspection ...)

Comment author: Eliezer_Yudkowsky 23 August 2013 06:46:54PM 3 points [-]

This is not incompatible with what I just said. It goes from 0 to tiny somewhere, not from 0 to 12-year-old.

Comment author: shminux 23 August 2013 06:59:24PM 0 points [-]

Can you bracket this boundary reasonably sharply? Say, mosquito: no, butterfly: yes?

Comment author: Eliezer_Yudkowsky 23 August 2013 08:34:30PM 10 points [-]

No, but I strongly suspect that all Earthly life without a frontal cortex would be regarded by my idealized morals as a more complicated paperclip. There may be exceptions: I have heard rumors that octopi pass the mirror test, and I will not be eating any octopus meat until that is resolved, because even in a world where I eat meat because optimizing my diet is more important and my civilization lets me get away with it, I do not eat anything that recognizes itself in a mirror. So a spider is a definite no, a chimpanzee an extremely probable yes, a day-old human infant an extremely probable no (though there are non-sentience-related reasons for me to care in that case), and pigs I am genuinely unsure of.

Comment author: Eliezer_Yudkowsky 24 August 2013 12:54:16AM 6 points [-]

To be clear, I am unsure whether pigs are objects of value; this incorporates empirical uncertainty about their degree of reflectivity, philosophical uncertainty about the precise relation of reflectivity to degrees of consciousness, and ethical uncertainty about how much my idealized morals would care about various degrees of consciousness, to the extent I can imagine that coherently. I can imagine that there is a sharp line of sentience which humans are over and pigs are under, and that my idealized caring would drop immediately to zero for anything under the line, but my subjective probability that both of these are simultaneously true is under 50%, though they are not independent.

However it is plausible to me that I would care exactly zero about a pig getting a dust speck in the eye... or not.

Comment author: Emile 25 August 2013 12:11:03PM *  1 point [-]

I do not eat anything that recognizes itself in a mirror.

Assuming pigs were objects of value, would that make it morally wrong to eat them? Unlike octopi, most pigs exist because humans plan on eating them, so if many humans stopped eating pigs, there would be fewer pigs, and the life of the average pig might not be much better.

(this is not a rhetorical question)

Comment author: Eliezer_Yudkowsky 25 August 2013 07:16:08PM 2 points [-]

Yes. If pigs were objects of value, it would be morally wrong to eat them, and indeed the moral thing to do would be to not create them.

Comment author: drethelin 25 August 2013 09:05:59PM 2 points [-]

I don't think it's morally wrong to eat people, if they happen to be in irrecoverable states.

Comment author: Vladimir_Nesov 25 August 2013 08:48:14PM *  2 points [-]

This needs a distinction between the value of creating pigs, existence of living pigs, and killing of pigs. If existing pigs are objects of value, but the negative value of killing them (of the event itself, not of the change in value between a living pig and a dead one) doesn't outweigh the value of their preceding existence, then creating and killing as many pigs as possible has positive value (relative to noise; with opportunity cost the value is probably negative, there are better things to do with the same resources; by the same token, post-FAI the value of "classical" human lives is also negative, as it'll be possible to make significant improvements).

Comment author: fubarobfusco 25 August 2013 11:37:53PM 0 points [-]

Does it matter to you that octopuses are quite commonly cannibalistic?

Comment author: Eliezer_Yudkowsky 26 August 2013 01:05:31AM 5 points [-]

No. Babyeater lives are still important.

Comment author: shminux 26 August 2013 03:11:48AM 1 point [-]

I was unable to empathize with this view when reading 3WC. To me the Prime Directive approach makes more sense. I was willing to accept that the Superhappies have an anti-suffering moral imperative, since they are aliens with their own alien morals, but the idea that all the humans on the IPW, or even its bridge officers, would be unanimous in their resolute desire to end the suffering of the Babyeater children strained my suspension of disbelief more than no one accidentally or intentionally making an accurate measurement of the star drive constant.

Comment author: Viliam_Bur 31 August 2013 12:10:07PM 1 point [-]

To me the Prime Directive approach makes more sense.

As an example outside of sci-fi, if you see an abusive husband and a brainwashed battered wife, the Prime Directive tells you to ignore the whole situation, because they both think it's more or less okay that way. Would you accept this consequence?

Would it make a moral difference if the husband and wife were members of a different culture; if they were humans living on a different planet; or if they belonged to a different sapient species?

Comment author: shminux 31 August 2013 06:52:26PM 0 points [-]

The idea behind the PD is that, for foreign enough cultures:

  • you can't predict the consequences of your intervention with reasonable certainty;

  • you can't trust your moral instincts to guide you to do the "right" thing;

  • the space of all favorable outcomes is likely much smaller than the space of all possible outcomes, as in the literal genie case;

  • so you end up acting like a UFAI more likely than not.

Hence non-intervention has higher expected utility than an intervention based on your personal deontology or virtue ethics. This is not true for sufficiently well-analyzed cases, like abuse in your own society. The farther you stray from known territory, the greater the chance that your intervention will be a net negative. Human history is rife with examples of this.

So, unless you can do a full consequentialist analysis of applying your morals to an alien culture, keep the hell out.

Comment author: MugaSofer 26 August 2013 05:18:02PM 0 points [-]

Funny, I parsed that as "should we then maybe be capturing them all to stop them eating each other?"

Didn't even occur to me that was an argument about extrapolated octopus values.

Comment author: Eliezer_Yudkowsky 26 August 2013 07:38:42PM 2 points [-]

It wasn't, your first parse would be a correct moral implication. The Babyeaters must be stopped from eating themselves.

Comment author: MugaSofer 26 August 2013 09:36:18PM 0 points [-]

... whoops.

I meant that I parsed fubarobfusco's comment differently from you ("they want to be cannibals, therefore it's ... OK to eat them? Somehow?"), because I just assumed that obviously you should save the poor octopi (i.e. it would "bother" you in the sense of moral anguish, not "betcha didn't think of this!").