Lukas_Gloor comments on Moral Anti-Epistemology - Less Wrong

0 Post author: Lukas_Gloor 24 April 2015 03:30AM


Comment author: OrphanWilde 02 May 2015 05:57:10PM 1 point

And if making people more informed in this manner makes them worse off?

Comment author: Lukas_Gloor 02 May 2015 11:29:35PM 0 points

The sad thing is that it probably will (the rationalist's burden: aspiring to be more rational makes rationalizing harder, and you can't just tweak your moral map and your map of the just world/universe to fit your desired (self-)image).

What is it that counts: revealed preferences, stated preferences, or preferences that are somehow idealized (what the person would want if they knew more, were smarter, etc.)? I'm not sure the last option can be pinned down in a non-arbitrary way. This would leave us with revealed preferences and stated preferences, even though stated preferences are often contradictory or incomplete. It would be confused to think that one type of preference is correct while the others aren't. There are simply different things going on, and you may choose to focus on one or the other. Personally, I don't intrinsically care about making people more agenty, but I care about it instrumentally, because it turns out that making people more agenty often increases their (revealed) concern for reducing suffering.

What does this make of the claim under discussion, that deontology could sometimes/often be a form of moral rationalizing? The point still stands, but with a caveat: it is only rationalizing if we are talking about (informed/complete) stated preferences. For whatever that's worth. On LW, I assume it is worth a lot to most people, but there's no mistake being made if it isn't for someone.