
Eliezer_Yudkowsky comments on Open thread, August 19-25, 2013 - Less Wrong Discussion

2 Post author: David_Gerard 19 August 2013 06:58AM




Comment author: Eliezer_Yudkowsky 21 August 2013 07:55:00PM 6 points [-]

I believe that I care nothing for nematodes, and that as the nervous systems at hand became incrementally more complicated, I would eventually reach a sharp boundary wherein my degree of caring went from 0 to tiny. Or rather, I currently suspect that an idealized version of my morality would output such.

Comment author: ahbwramc 22 August 2013 11:28:20PM 5 points [-]

I'm kind of curious as to why you wouldn't expect a continuous, gradual shift in caring. Wouldn't mind design space (which I would imagine your caring to be a function of) be continuous?

Comment author: Eliezer_Yudkowsky 23 August 2013 12:58:16AM 7 points [-]

Something going from 0 to 10^-20 is behaving pretty close to continuously in one sense. It is clear that there are some configurations of matter I don't care about at all (like a paperclip), while I do care about other configurations (like twelve-year-old human children), so it is elementary that at some point my utility function must go from 0 to nonzero. The derivative, the second derivative, or even the function itself could easily be discontinuous at this point.
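For concreteness (my illustration, not part of the original comment), the simplest function with this character is the ramp: continuous everywhere, but with a derivative that jumps at the threshold where caring begins:

```latex
u(x) = \max(0, x) =
\begin{cases}
0 & x \le 0,\\
x & x > 0,
\end{cases}
\qquad
u'(x) =
\begin{cases}
0 & x < 0,\\
1 & x > 0.
\end{cases}
```

Here $u$ goes from exactly zero to nonzero without any jump in its value, yet $u'$ is discontinuous at $x = 0$; a utility function crossing from "don't care" to "care" could behave the same way.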

Comment author: MugaSofer 23 August 2013 03:57:07PM *  -1 points [-]

It is clear that there are some configurations of matter I don't care about at all (like a paperclip), while I do care about other configurations (like twelve-year-old human children), so it is elementary that at some point my utility function must go from 0 to nonzero.

And ... it isn't clear that there are some configurations you care for ... a bit? Sparrows being tortured and so on? You don't care more about dogs than about insects, and more about chimpanzees than about dogs?

(I mean, most cultures have a Great Chain Of Being or whatever, so surely I haven't gone dreadfully awry in my introspection ...)

Comment author: Eliezer_Yudkowsky 23 August 2013 06:46:54PM 3 points [-]

This is not incompatible with what I just said. It goes from 0 to tiny somewhere, not from 0 to 12-year-old.

Comment author: shminux 23 August 2013 06:59:24PM 0 points [-]

Can you bracket this boundary reasonably sharply? Say, mosquito: no, butterfly: yes?

Comment author: Eliezer_Yudkowsky 23 August 2013 08:34:30PM 10 points [-]

No, but I strongly suspect that all Earthly life without a frontal cortex would be regarded by my idealized morals as a more complicated paperclip. There may be exceptions: I have heard rumors that octopi pass the mirror test, and I will not eat any octopus meat until that is resolved, because even in a world where I eat meat (because optimizing my diet is more important and my civilization lets me get away with it), I do not eat anything that recognizes itself in a mirror. So a spider is a definite no, a chimpanzee is an extremely probable yes, a day-old human infant is an extremely probable no (though there are non-sentience-related reasons for me to care in that case), and pigs I am genuinely unsure of.

Comment author: Eliezer_Yudkowsky 24 August 2013 12:54:16AM 6 points [-]

To be clear, I am unsure whether pigs are objects of value, which incorporates empirical uncertainty about their degree of reflectivity, philosophical uncertainty about the precise relation of reflectivity to degrees of consciousness, and ethical uncertainty about how much my idealized morals would care about various degrees of consciousness, to the extent I can imagine that coherently. I can imagine that there is a sharp line of sentience which humans are over and pigs are under, and that my idealized caring would drop immediately to zero for anything under the line, but my subjective probability of both of these being simultaneously true is under 50%, though they are not independent.

However it is plausible to me that I would care exactly zero about a pig getting a dust speck in the eye... or not.

Comment author: Emile 25 August 2013 12:11:03PM *  1 point [-]

I do not eat anything that recognizes itself in a mirror.

Assuming pigs were objects of value, would that make it morally wrong to eat them? Unlike octopi, most pigs exist because humans plan on eating them, so if a lot of humans stopped eating pigs, there would be fewer pigs, and the life of the average pig might not be much better.

(this is not a rhetorical question)

Comment author: Eliezer_Yudkowsky 25 August 2013 07:16:08PM 2 points [-]

Yes. If pigs were objects of value, it would be morally wrong to eat them, and indeed the moral thing to do would be to not create them.

Comment author: drethelin 25 August 2013 09:05:59PM 2 points [-]

I don't think it's morally wrong to eat people, if they happen to be in irrecoverable states.

Comment author: Vladimir_Nesov 25 August 2013 08:48:14PM *  2 points [-]

This needs a distinction between the value of creating pigs, the existence of living pigs, and the killing of pigs. If existing pigs are objects of value, but the negative value of killing them (of the event itself, not of the change in value between a living pig and a dead one) doesn't outweigh the value of their preceding existence, then creating and killing as many pigs as possible has positive value. (That is relative to noise; counting opportunity cost, the value is probably negative, since there are better things to do with the same resources. By the same token, post-FAI the value of "classical" human lives is also negative, as it will be possible to make significant improvements.)

Comment author: fubarobfusco 25 August 2013 11:37:53PM 0 points [-]

Does it matter to you that octopuses are quite commonly cannibalistic?

Comment author: Eliezer_Yudkowsky 26 August 2013 01:05:31AM 5 points [-]

No. Babyeater lives are still important.

Comment author: shminux 26 August 2013 03:11:48AM 1 point [-]

I was unable to empathize with this view when reading 3WC. To me the Prime Directive approach makes more sense. I was willing to accept that the Superhappies have an anti-suffering moral imperative, since they are aliens with alien morals; but that all the humans on the IPW, or even its bridge officers, would be unanimous in their resolute desire to end the suffering of the Babyeater children strained my suspension of disbelief more than no one accidentally or intentionally making an accurate measurement of the star drive constant.

Comment author: MugaSofer 26 August 2013 05:18:02PM 0 points [-]

Funny, I parsed that as "should we then maybe be capturing them all to stop them eating each other?"

Didn't even occur to me that was an argument about extrapolated octopus values.

Comment author: Bakkot 24 August 2013 06:48:08PM 1 point [-]

The derivative, the second derivative, or even the function itself could easily be discontinuous at this point.

But it needn't be! See for example f(x) = exp(-1/x) for x > 0, and f(x) = 0 for x ≤ 0: infinitely differentiable everywhere, yet identically zero on one whole side.

Wikipedia has an analysis.

(Of course, the space of objects isn't exactly isomorphic to the real line, but it's still a neat example.)
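For readers who want to verify the smoothness claim, here is a quick symbolic check (a sketch of mine using sympy, not part of the thread; it checks only the first few derivatives, though the pattern holds for all orders):

```python
# Every one-sided derivative of exp(-1/x) tends to 0 as x -> 0+, so gluing
# it to the constant 0 for x <= 0 yields a function that is infinitely
# differentiable everywhere, yet exactly zero for all x <= 0.
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1 / x)

for n in range(4):
    deriv = sp.diff(f, x, n)          # n-th derivative on x > 0
    lim = sp.limit(deriv, x, 0, dir='+')  # one-sided limit at 0
    print(n, lim)                      # each limit is 0
```

Since every derivative matches the zero function's derivatives at the boundary, the glued function is C-infinity there despite going from exactly 0 to nonzero.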

Comment author: Eliezer_Yudkowsky 24 August 2013 07:11:47PM 1 point [-]

Agreed, but it is not obvious to me that my utility function needs to be differentiable at that point.

Comment author: Armok_GoB 27 August 2013 08:09:04PM 0 points [-]

I dispute that; the paperclip is almost certainly either more or less likely to become a Boltzmann brain than an equivalent volume of vacuum.

Comment author: David_Gerard 21 August 2013 10:22:52PM *  1 point [-]

But zero is not a probability.

Edit: Adele_L is right, I was confusing utilities and probabilities.

Comment author: Adele_L 22 August 2013 12:04:33AM 13 points [-]

Zero is a utility, and utilities can even be negative (e.g. if Eliezer hated nematodes).

Comment author: MugaSofer 23 August 2013 03:40:50PM 0 points [-]

... are you pointing out that there is a nonzero probability that Eliezer's CEV actually cares about nematodes?

Comment author: David_Gerard 24 August 2013 04:15:40PM 1 point [-]

No, Adele_L is right, I was confusing utilities and probabilities.

Comment author: MugaSofer 23 August 2013 03:40:07PM 0 points [-]

... really?

Um, that strikes me as very unlikely. Could you elaborate on your reasoning?

Comment author: Armok_GoB 27 August 2013 08:04:25PM *  0 points [-]

Keyword here is believe. What probability do you assign?

And if you say epsilon or something like that, is the epsilon bigger or smaller than 1/(3^^^3/10^100)?