Moss_Piglet comments on A Voting Puzzle, Some Political Science, and a Nerd Failure Mode - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Well, the whole point of instrumental rationality is that you need a correct map of reality (i.e., to care about the truth) in order to reach your goals, whatever they are.
There is a strong signaling issue: appearing to be a conservative Christian can confer significant political and social benefits in some circles, and it's easier to appear to be one when you truly are one. But apart from that, holding the beliefs of a conservative Christian leads you to acts that are inefficient for reaching your goals, from rejecting your gay grandson, to wasting time in prayer, to forgoing cryonics because you believe in an afterlife.
Having a flawed map of a city means you won't reach your destination efficiently: you'll either miss it entirely or spend far more time and resources getting there. That's true whatever your destination is, and the same holds for navigating your life with a flawed map of reality.
Even if you do not care about the truth for its own sake (if curiosity and a preference for truth aren't among your terminal values), then if you're intelligent, you should still care about the truth as an instrumental value for reaching whatever goals you truly have.
This is a good point, and holds in the majority of cases, although there are other considerations which should also be mentioned.
Since all maps are 'flawed' by definition, an important question is whether the flaws in your map actually interact with your goals, and if they do, whether they are beneficial or harmful. It's usually not a good use of your energy to fine-tune areas of your map that don't have any impact on your life, and it's actively wasteful to "fix" them in ways that make it harder to achieve your goals.
Incorrect beliefs can be useful in the aggregate even if they fail in certain situations, as long as those situations are rare or inconsequential enough. I can be utterly wrong in my belief that there are no tigers in New York City (there are several in the Bronx Zoo, not to mention that more might well be kept illegally as pets) but it's completely orthogonal to my daily life and thus not important enough to spend effort investigating. And if I had a pathological fear of tigers, I would gain a pretty significant advantage from that same false belief; I would do well to maintain it even if presented with genuine counter-evidence.
I think that most religions are wrong to harmful degrees, but it's not an ironclad rule of rationality that beliefs must be maximally accurate. A pessimist may well be more accurate in their assessments of people, but optimists are happier and more successful; if your rationality insists that you cannot be optimistic, then it is not useful and should be ignored.