
michael_vassar comments on Knowing About Biases Can Hurt People - Less Wrong

70 Post author: Eliezer_Yudkowsky 04 April 2007 06:01PM


Comment author: michael_vassar 04 April 2007 07:17:04PM 24 points

Humans aren't just not perfect Bayesians. Very, very few of us are even Bayesian wannabes. In essence, everyone who thinks it is more moral/ethical to hold some proposition than to hold its converse is taking some criterion other than apparent truth as normative with respect to the evaluation of beliefs.

Comment author: DSimon 30 September 2010 03:32:36PM 8 points

This is something of a nitpick, but I think that it is more moral/ethical to hold a proposition than to hold its converse if there is good reason to think that that proposition is true. Is this un-Bayesian?

Comment author: robertskmiles 04 December 2011 01:03:37PM 9 points

It's a meta-level/aliasing sort of problem, I think. You don't believe it's more ethical/moral to believe any specific proposition; you believe it's more ethical/moral to believe 'the proposition most likely to be true', which is a variable that can be filled with whatever proposition the situation suggests, so it's a different class of thing. Effectively it's equivalent to 'taking apparent truth as normative', so I'd call it the only position of that format that is Bayesian.
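The aliasing point can be sketched in code. This is a toy illustration with hypothetical names (`believe_specific`, `believe_most_likely`, and the example propositions are all made up here, not from the original comments): committing to a fixed proposition is a constant, while committing to 'the proposition most likely to be true' is an indirection that resolves differently as the evidence changes.

```python
def believe_specific(propositions):
    """Endorse a fixed proposition, regardless of the evidence."""
    return "the sky is green"

def believe_most_likely(propositions):
    """Endorse whichever proposition currently has the highest probability.
    This is the 'variable' position: it aliases to different propositions
    as the probability assignments change."""
    return max(propositions, key=propositions.get)

# Toy probability assignments over two rival propositions.
evidence = {"the sky is blue": 0.95, "the sky is green": 0.05}

print(believe_specific(evidence))     # fixed, ignores the evidence
print(believe_most_likely(evidence))  # tracks whatever the evidence favors
```

Only the second function takes apparent truth as normative, which is the sense in which it is the one Bayesian-compatible position of this form.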

Comment author: christopherj 14 October 2013 02:30:58AM 3 points

This website seems to have two definitions of rationality: rationality as truth-finding, and rationality as goal-achieving. Since truth deals with "is", and morality deals with "ought", morality will be of the latter kind. Because they are two different definitions, at some point they can be at odds -- but what if your primary goal is truth-finding (which might be required by your statement if you make no exceptions for beneficial self-deception)? How would you feel about ignoring some truths, because they might lead you to miss other truths?

This article is about how learning some truths can prevent you from learning other truths, with an implication that the order of learning will mitigate these effects. In some cases, you might be well served by purging truths from your mind (for example, "there is a minuscule possibility of X" will activate priming and the availability heuristic). Some truths are simply much more useful than others, so what do you do if some lesser truths can get in the way of greater truths?

Comment author: Nornagest 14 October 2013 02:55:59AM 1 point

Neither truth-finding nor goal-achieving quite captures the usual sense of the word around here. I'd say the latter is closer to how we usually use it, in that we're interested in fulfilling human values; but explicit, surface-level goals don't always further deep values, and in fact can be actively counterproductive thanks to bias or partial or asymmetrical information.

Almost everyone who thinks they terminally value truth-finding is wrong; it makes a good applause light, but our minds just aren't built that way. But since there are so many cognitive and informational obstacles in our way, finding the real truth is at some point going to be critically important to fulfilling almost any real-world set of human values.

On the other hand, I don't rule out beneficial self-deception in some situations, either. It shouldn't be necessary for any kind of hypothetical rationalist super-being, but there aren't too many of those running around.

Comment author: datadataeverywhere 30 September 2010 05:00:30PM -2 points

This seems like a shorthand for denying the existence of morals and ethics. I don't think that's what you mean, but I've heard that exact argument used to support nihilism.

If I say "torture is unethical", I might mean "I believe that torture, for its own sake and without a greater positive offset, is unethical", which is objectively true (please, I entreat you to examine my source code). But it would be just as objectively true to say the negation if I actually believed the negation. Is it neither moral nor immoral to hold the belief that torture is a bad thing?