Kaj_Sotala comments on Open Thread, December 2-8, 2013 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
One piece of common wisdom on LW is that if you expect that receiving a piece of information will make you update your beliefs in a certain direction, you might as well update already instead of waiting. I happened to think of one exception: when you expect that something will change your beliefs even though it shouldn't, because it uses strong rhetorical techniques (e.g. highlighting highly unrepresentative examples) whose effect you can't fully eliminate even when you know they're there.
(I have a feeling that this might have been discussed before, but I don't remember where in that case.)
It's more like, if you expect (in the statistical sense) that you will rationally update your beliefs in some direction upon receiving some piece of evidence, then your current probability assignments are incoherent, and you should update on pain of irrationality. It's not just that you might as well update now instead of waiting. But this only applies if your expected future update is one that you rationally endorse. If you know that your future update will be irrational, that it is not going to be the appropriate response to the evidence presented, then your failure to update right now is not necessarily irrational. The proof of incoherence does not go through in this case.
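The incoherence claim here is just conservation of expected evidence: a coherent prior must equal the probability-weighted average of the posteriors you could end up with. A minimal numeric sketch (all probability values are illustrative, not from the thread):

```python
# Conservation of expected evidence: the prior P(H) must equal the
# expectation of the posterior over the possible observations E / not-E.
# All numbers below are illustrative.

p_h = 0.3            # prior P(H)
p_e_given_h = 0.8    # likelihood P(E | H)
p_e_given_not_h = 0.2  # likelihood P(E | not-H)

# Marginal probability of observing E
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posteriors by Bayes' rule, for each possible observation
post_e = p_e_given_h * p_h / p_e
post_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

# Expected posterior, weighted by how likely each observation is
expected_posterior = p_e * post_e + (1 - p_e) * post_not_e

print(round(expected_posterior, 6))  # 0.3 -- exactly the prior
```

If your *expected* posterior differed from your prior, one of the numbers above would have to be incoherent, which is the sense in which you "should update already." The exception in the parent comment is precisely the case where the anticipated update is not a Bayes'-rule update at all.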
This seems related, though not exactly what you are asking for.
There's an intermediate step of believing things because you expect them to be true (rather than merely convincing). The problem is fully corrected if, when updating, you weigh how well the evidence correlates with truth rather than how convincing it feels.
In other words, if you would expect the fifth column more after seeing sabotage, and also more after seeing no sabotage, then you can collapse that into simply expecting the fifth column more right now.
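The fifth-column setup can be checked directly: because the prior is a weighted average of the two possible posteriors, no coherent assignment of likelihoods lets *both* observations raise your credence. A small brute-force sketch (grid values are illustrative):

```python
# If both "saw sabotage" and "saw no sabotage" would raise your credence
# in hypothesis H, your probabilities are incoherent: the prior is a
# probability-weighted average of the two posteriors, so it cannot lie
# strictly below both of them.
import itertools

def posteriors(prior, p_e_h, p_e_not_h):
    """Posterior P(H|E) and P(H|not-E) via Bayes' rule."""
    p_e = p_e_h * prior + p_e_not_h * (1 - prior)
    post_e = p_e_h * prior / p_e
    post_not_e = (1 - p_e_h) * prior / (1 - p_e)
    return post_e, post_not_e

# Scan a grid of likelihood pairs: in every case at least one of the
# two posteriors is at or below the prior.
prior = 0.3
grid = [i / 10 for i in range(1, 10)]
for p_e_h, p_e_not_h in itertools.product(grid, grid):
    post_e, post_not_e = posteriors(prior, p_e_h, p_e_not_h)
    assert min(post_e, post_not_e) <= prior + 1e-9
```

The grid scan never finds a counterexample, which is the "reduce that into just expecting the fifth column more" point: any genuine both-directions expectation should already be folded into the prior.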
This seems like a breakdown in reflective consistency. Shouldn't you try to actively counter or avoid the expected irrationality pressure, instead of (irrationally and meekly) waiting for it to nudge your mind in the wrong direction? Is there a specific example that prompted your comment? I can think of some cases offhand. Say, you work at a failing company and are required to attend an all-hands pep talk by the CEO, who wants to keep employee morale up. There are multiple ways to avoid being swayed by the rhetoric: not listening, writing down possible arguments and counterarguments in advance, listing the likely biases and fallacies the speaker will play on and making a point of identifying and writing them down in real time, etc.
No specific examples originally, but Yvain had a nice discussion of persuasive crackpot theories on his old blog (now friends-locked, but I think sharing the excerpt below is okay), which seems like a good example:
As for trying to actively counter the effect of the misleading rhetoric, one can certainly try, but one should also keep in mind that we're generally quite bad at this. E.g. while not exactly the same thing, this bit from Misinformation and its Correction seems relevant:
Sure, you should try to counter. But sometimes the costs of doing that are higher than the losses that will result from an incorrect belief.