Nick_Tarleton comments on Does Your Morality Care What You Think? - Less Wrong

Post author: Eliezer_Yudkowsky 26 July 2008 12:25AM


Comment author: Nick_Tarleton 26 July 2008 08:00:07PM 2 points

I have to agree with PK and Ben; there's a heck of a lot more pressure for minds to converge on 2+3=5 than on any ethical statement. A mind that believes 2+3=6 will make wrong predictions about reality; a mind that has 'wrong' 'beliefs' about murder won't. (George has a point about game theory, but that's different from regarding someone else's death as terminally undesirable.) "The Platonic computation I implement judges murder as undesirable regardless of what anybody thinks" isn't the same as "murder is wrong regardless of what anybody thinks". I could define 'wrong' according to the output of my computation, but such an agent-relative definition would be silly.

...unless most humans converge to the same terminal values, in which case we could sensibly define "wrong" as the output of the computation implemented by humanity. There, it adds up to normality.

...well, kind of. That definition won't do by itself for moral arguments - it'd be like the calculator that computes "what does this calculator compute as the result of 2 + 3?" - any answer is correct. Some actual content is still needed.
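The calculator analogy can be made concrete. Here is a minimal sketch (my illustration, not from the comment) of why a purely self-referential definition is vacuous: if a calculator's only specification is "output whatever this calculator outputs for 2 + 3", then any output satisfies the specification, whereas an ordinary calculator's specification pins down a unique answer.

```python
# Hypothetical illustration: a "calculator" whose only spec is
# "return whatever this calculator returns for 2 + 3".
def self_referential_calc(answer):
    """Any value trivially satisfies the self-referential spec."""
    return answer

# Every candidate output is "correct" under that spec:
assert all(self_referential_calc(x) == x for x in (5, 6, 42))

# Contrast: a normal calculator's spec determines a unique answer.
def normal_calc():
    return 2 + 3

assert normal_calc() == 5
```

The self-referential version has no content of its own; some independent specification of the computation is still needed, which is the point of the final paragraph.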