lukeprog comments on In Defense of Moral Investigation - Less Wrong

-5 Post author: MTGandP 04 November 2012 04:26AM


Comment author: lukeprog 03 November 2012 10:19:36PM 13 points

Any new information about reality, if properly understood... can only cause people to become more ethical

Whether this is true depends on your definition of "ethical." In any case, your claim here doesn't weigh against the idea "that certain claims about the nature of reality could cause people to become more immoral" because people do not, in fact, always "properly understand" new information about reality.

Eliezer did say "Doing worse with more knowledge means you are doing something very wrong," but check what he said in the very next paragraph: "On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge." The trouble is that current people are indeed only half-rational, or worse.

A particular truth can only hurt someone if he holds a false belief.

A counterexample: Suppose that a human-level AI, Ralph, holds only true beliefs. But Ralph doesn't yet know that Petunia exists. The superintelligent Omega tortures everyone who knows that Petunia exists. Now, Ralph learns that Petunia exists. But this truth hurts him, even though he doesn't hold a false belief.

Comment author: Larks 06 November 2012 06:50:20PM 4 points

We have even better counterexamples against

Any new information about reality, if properly understood (that part is important), can only cause people to become more ethical.

Suppose Alice hates tall people with a passion. Then she learns about a gathering of vulnerable tall people. She properly understands all the relevant consequences of this fact, including that she can now act on her hatred!

Comment author: CarlShulman 04 November 2012 03:26:21AM 4 points

Was there any need for AIs in the example?

Comment author: lukeprog 04 November 2012 04:08:45AM 8 points

I did it for clarity. I'm not sure what a human with only true beliefs looks like, or whether "a human with only true beliefs" will be a sensible phrase after we leave folk psychology behind.

Comment author: Eugine_Nier 04 November 2012 07:50:40PM 2 points

Eliezer did say "Doing worse with more knowledge means you are doing something very wrong," but check what he said in the very next paragraph: "On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge."

The solution to this problem is to do something Eliezer always argues against for some reason, namely, to compartmentalize.

Also, it's likely that becoming more rational won't necessarily help, for the reasons you mentioned here.

Comment author: BarbaraB 05 January 2014 09:45:04AM 0 points

The solution to this problem is to do something Eliezer always argues against for some reason, namely, to compartmentalize.

Does Eliezer really do that? I got the impression that compartmentalization is, at least in some cases, perceived as a functional model to avoid big mistakes. (Just as you suggest.)