Eugine_Nier comments on In Defense of Moral Investigation - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (78)
Whether this is true depends on your definition of "ethical." In any case, your claim here doesn't weigh against the idea "that certain claims about the nature of reality could cause people to become more immoral" because people do not, in fact, always "properly understand" new information about reality.
Eliezer did say "Doing worse with more knowledge means you are doing something very wrong," but check what he said in the very next paragraph: "On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge." The trouble is that current people are indeed only half-rational, or worse.
A counterexample: Suppose that a human-level AI, Ralph, holds only true beliefs. But Ralph doesn't yet know that Petunia exists. The superintelligent Omega tortures everyone who knows that Petunia exists. Now, Ralph learns that Petunia exists. But this truth hurts him, even though he doesn't hold a false belief.
The solution to this problem is to do something Eliezer always argues against for some reason, namely, to compartmentalize.
Also, it's likely that becoming more rational won't necessarily help, for the reasons you mentioned here.
Does Eliezer really do that? I got the impression that compartmentalization is, at least in some cases, perceived as a functional way to avoid big mistakes (just as you suggest).