Eliezer_Yudkowsky comments on Open Thread: July 2010, Part 2 - Less Wrong

Post author: Alicorn 09 July 2010 06:54AM


Comment author: Eliezer_Yudkowsky 21 July 2010 09:50:27AM 2 points

I don't consider frogs to be objects of moral worth.

Comment author: multifoliaterose 21 July 2010 10:19:39AM 8 points

Why not?

Comment author: RichardKennaway 21 July 2010 11:40:38AM 2 points

Seconded, and how do you (Eliezer) rate other creatures on the Great Chain of Being?

Comment author: VNKKET 21 July 2010 09:28:45PM * 12 points

Are there any possible facts that would make you consider frogs objects of moral worth if you found out they were true?

(Edited for clarity.)

Comment author: Eliezer_Yudkowsky 22 July 2010 09:04:11PM 5 points

"Frogs have subjective experience" is the biggy. There are a number of other things I already know myself to be confused about which bear on that, and so I don't know exactly what I should be looking for in the frog that would make me think it had a sense of its own existence. Certainly there are any number of news items I could receive about the frog's mental abilities, brain complexity, type of algorithmic processing, ability to reflect on its own thought processes, etcetera, which would make me think it more likely that the frog was what a non-confused version of myself would regard as fulfilling the predicate I currently call "capable of experiencing pain", as opposed to being a more complicated version of the neural-network reinforcement-learning algorithms that I have no qualms about running on a computer.

A simple example would be if frogs could recognize dots painted on them when seeing themselves in mirrors, or if frogs showed signs of being able to learn very simple grammar like "jump blue box". (If all human beings were being cryonically suspended I would start agitating for the chimpanzees.)

Comment author: DanielVarga 25 July 2010 07:14:12AM 3 points

I am very surprised that you suggest that "having subjective experience" is a yes/no thing. I thought it was the consensus opinion here that it is not. I am not sure about others on LW, but I would even go three steps further: it is not even a strict ordering of things. It is not even a partial ordering of things. I believe it can only be defined in the context of an Observer and an Object, where the Observer gives some amount of weight to the theory that the Object's subjective experience is similar to the Observer's own.

Comment author: Blueberry 29 July 2010 12:15:57AM 1 point

I thought it was the consensus opinion here that it is not.

Links? I'd be interested in seeing what people on LW thought about this, if it's been discussed before. I can understand the yes/no position, or the idea that there's a blurry line somewhere between thermostats and humans, but I don't understand what you mean about the Observer and Object. The Observer in your example has subjective experience?

Comment author: Utilitarian 23 July 2010 08:19:37AM 3 points

I like the way you phrased your concern for "subjective experience" -- those are the types of characteristics I care about as well.

But I'm curious: what does the ability to learn simple grammar have to do with subjective experience?

Comment author: XiXiDu 22 July 2010 10:35:46AM * 11 points

Questions of priority - and the relative intensity of suffering between members of different species - need to be distinguished from the question of whether other sentient beings have moral status at all. I guess that was what shocked me about Eliezer's bald assertion that frogs have no moral status. After all, humans may be less sentient than frogs compared to our posthuman successors. So it's unsettling to think that posthumans might give simple-minded humans the same level of moral consideration that Eliezer accords frogs.

-- David Pearce via Facebook

Comment author: CarlShulman 21 July 2010 06:34:51PM 10 points

I'm surprised. Do you mean you wouldn't trade off a dust speck in your eye (in some post-singularity future where x-risk is settled one way or another) to avert the torture of a billion frogs, or of some noticeable portion of all frogs? If we plotted your attitudes to progressively more intelligent entities, where's the discontinuity or discontinuities?

Comment author: Bongo 24 July 2010 11:50:35AM 2 points

Hopefully he still thinks there's a small probability of frogs being able to experience pain, so that the expected suffering of frog torture would be hugely greater than a dust speck.

Comment author: Vladimir_Nesov 21 July 2010 07:25:54PM * 2 points

You'd need to change that to 10^6 specks and 10^15 frogs or something, because the emotional reaction to choosing to kill the frogs is also part of the consequences of the decision, and this particular consequence might have moral value that outweighs one speck.

Your emotional reaction to a decision about human lives is irrelevant, because the lives in question hold most of the moral worth; with a decision to kill billions of cockroaches (to be safe from the question of the moral worth of frogs), the lives of the cockroaches are irrelevant, and your emotional reaction holds most of the moral worth.

Comment author: Utilitarian 21 July 2010 11:26:32PM 3 points

the lives of the cockroaches are irrelevant

I'm not so sure. I'm no expert on the subject, but I suspect cockroaches may have moderately rich emotional lives.

Comment author: Blueberry 21 July 2010 07:36:19PM 1 point

Do you mean you wouldn't trade off a dust speck in your eye (in some post-singularity future where x-risk is settled one way or another) to avert the torture of a billion frogs, or of some noticeable portion of all frogs?

Depends. Would that make it harder to get frog legs?

Comment author: RichardKennaway 21 July 2010 01:13:56PM 2 points

Would you save a stranded frog, though?

Comment author: cousin_it 21 July 2010 12:36:07PM * 1 point

Yeah, trying to save the world does that to you.

ETA (May 2012): wow, I can't understand what prompted me to write a comment like this. Sorry.

Comment author: Rain 21 July 2010 12:48:15PM * 1 point

Axiom: The world is worth saving.
Fact: Frogs are part of the world.
Inference: Frogs are worth saving in proportion to their measure and effect on the world.
Query: Is life worth living if all you do is save more of it?

Comment author: cousin_it 21 July 2010 12:51:48PM * 3 points

I don't know. I'm not Eliezer. I'd save the frogs because it's fun, not because of some theory.

Comment author: RichardKennaway 21 July 2010 01:13:34PM 1 point

Is life worth living if all you do is save more of it?

As a matter of practical human psychology, no. People cannot just give and give and get nothing back from it but self-generated warm fuzzies, a score kept in your head by rules of your own that no-one else knows or cares about. You can do some of that, but if that's all you do, you just get drained and burned out.

Comment author: JoshuaZ 21 July 2010 01:27:29PM 0 points

Axiom: The world is worth saving.
Fact: Frogs are part of the world.
Inference: Frogs are worth saving in proportion to their measure and effect on the world.

The inference does not follow from the axiom and the fact. It doesn't follow that the world is more likely to be saved if I save frogs. It also doesn't follow that saving frogs is the most efficient use of my time if I'm going to spend time saving the world. I could, for example, use that time to help reduce existential risk factors for everyone, which would happen to incidentally reduce the risk to frogs.

Comment author: Rain 21 July 2010 01:53:03PM * 3 points

I find it difficult to explain, but know that I disagree with you. The world is worth saving precisely because of the components that make it up, including frogs. The inference does follow from the axiom, unless you have a (fairly large) list of properties or objects in the world that you've deemed out of scope (not worth saving independently of the entire world). Do you have such a list, even implicitly? I might agree that frogs are out of scope, as that was one component of my motivation for posting this thread.

And stating that there are "more efficient" ways of saving frogs than directly saving frogs does not refute the initial inference that frogs are worth saving in proportion to their measure and effect on the world. Perhaps you are really saying "their proportion and measure are low enough as to make it not worth the time to stoop and pick them up"? Which I might also agree with.

But in my latest query, I was trying to point out that "a safe Singularity is a more efficient means of achieving goal X" or "a well thought out existential risk reduction project is a more efficient means of saving Y" can be used as a fully general counterargument, and I was wondering if people really believe they trump all other actions one might take.

Comment author: Utilitarian 21 July 2010 05:56:53PM * 12 points

I'm surprised by Eliezer's stance. At the very least, it seems the pain endured by the frogs is terrible, no? For just one reference on the subject, see, e.g., K.L. Machin, "Amphibian pain and analgesia," Journal of Zoo and Wildlife Medicine, 1999.

Rain, your dilemma reminds me of my own struggles regarding saving worms in the rain. While stepping on individual worms to put them out of their misery is arguably not the most efficient means to prevent worm suffering, as a practical matter, I think it's probably an activity worth doing, because it builds the psychological habit of exerting effort to break from one's routine of personal comfort and self-maintenance in order to reduce the pain of other creatures. It's easy to say, "Oh, that's not the most cost-effective use of my time," but it can become too easy to say that all the time to the extent that one never ends up doing anything. Once you start doing something to help, and get in the habit of expending some effort to reduce suffering, it may actually be easier psychologically to take the efficiency of your work to the next level. ("If saving worms is good, then working toward technology to help all kinds of suffering wild animals is even better. So let me do that instead.")

The above point applies primarily to those who find themselves devoting less effort to charitable projects than they could. For people who already come close to burning themselves out by their dedication to efficient causes, taking on additional burdens to reduce just a bit more suffering is probably not a good idea.

Comment author: Blueberry 21 July 2010 07:40:47PM -1 points

At the very least, it seems the pain endured by the frogs is terrible, no?

Maybe so, but the question is why we should care.

While stepping on individual worms to put them out of their misery is arguably not the most efficient means to prevent worm suffering, as a practical matter, I think it's probably an activity worth doing

If only for the cheap signaling value.

Comment author: Utilitarian 21 July 2010 11:17:07PM * 3 points

If only for the cheap signaling value.

My point was that the action may have psychological value for oneself, as a way of getting in the habit of taking concrete steps to reduce suffering -- habits that can grow into more efficient strategies later on. One could call this "signaling to oneself," I suppose, but my point was that it might have value in the absence of being seen by others. (This is over and above the value to the worm itself, which is surely not unimportant.)

Comment author: CronoDAS 22 July 2010 04:25:38PM 1 point

What about dogs?