CarlShulman comments on In Defense of Moral Investigation - Less Wrong

-5 Post author: MTGandP 04 November 2012 04:26AM




Comment author: lukeprog 03 November 2012 10:19:36PM 13 points

> Any new information about reality, if properly understood... can only cause people to become more ethical

Whether this is true depends on your definition of "ethical." In any case, your claim here doesn't weigh against the idea "that certain claims about the nature of reality could cause people to become more immoral" because people do not, in fact, always "properly understand" new information about reality.

Eliezer did say "Doing worse with more knowledge means you are doing something very wrong," but check what he said in the very next paragraph: "On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge." The trouble is that current people are indeed only half-rational, or worse.

> A particular truth can only hurt someone if he holds a false belief.

A counterexample: Suppose that a human-level AI, Ralph, holds only true beliefs, but doesn't yet know that Petunia exists. The superintelligent Omega tortures everyone who knows that Petunia exists. Now Ralph learns that Petunia exists. This truth hurts him, even though he holds no false belief.

Comment author: CarlShulman 04 November 2012 03:26:21AM 4 points

Was there any need for AIs in the example?

Comment author: lukeprog 04 November 2012 04:08:45AM 8 points

I did it for clarity. I'm not sure what a human with only true beliefs would look like, or whether "a human with only true beliefs" will even be a sensible phrase once we leave folk psychology behind.