Will_Newsome comments on Compartmentalization in epistemic and instrumental rationality - Less Wrong
This leaves out the danger that a realistic assessment of your ability can be hazardous to your actual performance. People who overestimate their ability accomplish more than people who estimate it realistically, and Richard Wiseman's luck research shows that believing you're lucky will actually make it so.
I think instrumental rationalists should perhaps follow a modified Tarski litany, "If I live in a universe where believing X gets me Y, and I wish Y, then I wish to believe X". ;-)
Actually, more precisely: "If I live in a universe where anticipating X gets me Y, and I wish Y, then I wish to anticipate X, even if X will not really occur." I can far/symbolically "believe" that life is meaningless and that I could be killed at any moment, but if I want to function in life, I'd darn well better not be emotionally anticipating that my life is meaningless now or that I'm actually about to be killed by random chance.
(Edit to add a practical example: a golfer envisions and attempts to anticipate every shot as if it were going to be a hole-in-one, even though most of them will not be... but in the process, achieves a better result than if s/he anticipated performing an average shot. Here, X is the perfect shot, and Y is the improved shot resulting from the visualization. The compartmentalization that must occur for this to work is that the "far" mind must not be allowed to break the golfer's concentration by pointing out that the envisioned shot is a lie, and that one should therefore not be feeling the associated feelings.)
I wrote a slightly less general version of the Litany of Gendlin along similar lines, based on the one specific case I know of where believing something can produce utility:
The last two lines may be truncated for some values of X, but usually shouldn't be.