AnnaSalamon comments on Compartmentalization in epistemic and instrumental rationality - Less Wrong

Post author: AnnaSalamon 17 September 2010 07:02AM


Comment author: pjeby 17 September 2010 06:05:21PM * 6 points

The key is simple: the downsides from de-compartmentalization stem from allowing a putative fact to overwrite other knowledge (e.g., letting one’s religious beliefs overwrite knowledge about how to successfully reason in biology, or letting a simplified ev. psych overwrite one's experiences of what dating behaviors work). So, the solution is to be really damn careful not to let new claims overwrite old data.

This is leaving out the danger that realistic assessments of your ability can be hazardous to your ability to actually perform. People who over-estimate their ability accomplish more than people who realistically estimate it, and Richard Wiseman's luck research shows that believing you're lucky will actually make it so.

I think instrumental rationalists should perhaps follow a modified Tarski litany, "If I live in a universe where believing X gets me Y, and I wish Y, then I wish to believe X". ;-)

Actually, more precisely: "If I live in a universe where anticipating X gets me Y, and I wish Y, then I wish to anticipate X, even if X will not really occur". I can far/symbolically "believe" that life is meaningless and I could be killed at any moment, but if I want to function in life, I'd darn well better not be emotionally anticipating that my life is meaningless now or that I'm actually about to be killed by random chance.

(Edit to add a practical example: a golfer envisions and attempts to anticipate every shot as if it were going to be a hole-in-one, even though most of them will not be... but in the process, achieves a better result than if s/he anticipated performing an average shot. Here, X is the perfect shot, and Y is the improved shot resulting from the visualization. The compartmentalization that must occur for this to work is that the "far" mind must not be allowed to break the golfer's concentration by pointing out that the envisioned shot is a lie, and that one should therefore not be feeling the associated feelings.)

Comment author: AnnaSalamon 17 September 2010 09:52:49PM * 6 points

(Edit to add a practical example: a golfer envisions and attempts to anticipate every shot as if it were going to be a hole-in-one, even though most of them will not be... but in the process, achieves a better result than if s/he anticipated performing an average shot. Here, X is the perfect shot, and Y is the improved shot resulting from the visualization. The compartmentalization that must occur for this to work is that the "far" mind must not be allowed to break the golfer's concentration by pointing out that the envisioned shot is a lie, and that one should therefore not be feeling the associated feelings.)

It seems to me there are two categories of mental events that you are calling anticipations. One category is predictions (which can be true or false, and honest or self-deceptive); the other is declarations, or goals (which have no truth-values). To have a near-mode declaration that you will hit a hole-in-one, and to visualize it and aim toward it with every fiber of your being, is not at all the same thing as near-mode predicting that you will hit a hole-in-one (and so being shocked if you don't, betting piles of money on the outcome, etc.). But you've done more experiments here than I have; do you think the distinction between "prediction" and "declaration/aim" exists only in far mode?

Comment author: pjeby 18 September 2010 02:19:40AM * 0 points

not at all the same thing as near-mode predicting that you will hit a hole-in-one (and so being shocked if you don't, betting piles of money on the outcome, etc.).

To be clear, one is compartmentalizing - deliberately separating the anticipation of "this is what I'm going to feel in a moment when I hit that hole-in-one" from the kind of anticipation that would let you place a bet on it.

This example is only one of many where compartmentalizing your epistemic knowledge from your instrumental experience is a damn good idea, because it would otherwise interfere with your ability to perform.

do you think the distinction between "prediction" and "declaration/aim" exists only in far mode?

What I'm saying is that decompartmentalization is dangerous to many instrumental goals, since epistemic knowledge of uncertainty can rob you of necessary clarity during the preparation and execution of your actual action and performance.

To perform confidently and with motivation, it is often necessary to think and feel "as if" certain things were true, which may in fact not be true.

Note, though, that with respect to the declaration/prediction divide you propose, Wiseman's luck research doesn't say anything about people declaring intentions to be lucky, AFAICT, only anticipating being lucky. This expectation seems to prime unconscious perceptual filters as well as automatic motivations that do not occur when people do not expect to be lucky.

I suspect that one reason this works well for vague expectations such as "luck" is that the expectation can be confirmed by many possible outcomes, and so is more self-sustaining than more-specific beliefs would be.

We can also consider Dweck and Seligman's mindset and optimism research under the same umbrella: the "growth" mindset anticipates only that the learner will improve with effort over time, and the optimist merely anticipates that setbacks are not permanent, personal, or pervasive.

In all cases, AFAICT, these are actual beliefs held by the parties under study, not "declarations". (I would guess the same also applies to the medical benefits of believing in a personally-caring deity.)

Comment author: Will_Newsome 21 September 2010 08:11:23PM 2 points

What I'm saying is that decompartmentalization is dangerous to many instrumental goals, since epistemic knowledge of uncertainty can rob you of necessary clarity during the preparation and execution of your actual action and performance.

Compartmentalization only seems necessary when actually doing things; actually hitting golf balls or acting in a play or whatever. But during down time, epistemic rationality does not seem to be harmed. Saying 'optimists' indicates that optimism is a near-constantly activated trait, which does sound like it would harm epistemic rationality. Perhaps realists could do as well as or better than optimists if they learned to emulate optimists only when actually doing things like golfing or acting, but switched to 'realist' mode as much as possible to ensure that the decompartmentalization algorithms are running at max capacity. This seems like plausible human behavior; at any rate, if realism as a trait doesn't allow one to periodically be optimistic when necessary, then I worry that optimism as a trait wouldn't allow one to periodically be realistic when necessary. The latter sounds more harmful, but I optimistically expect that such tradeoffs aren't necessary.

Comment author: pjeby 21 September 2010 11:53:52PM 1 point

Saying 'optimists' indicates that optimism is a near-constantly activated trait, which does sound like it would harm epistemic rationality. Perhaps realists could do as well as or better than optimists if they learned to emulate optimists only when actually doing things like golfing or acting,

I rather doubt that, since one of the big differences between the optimists and pessimists is the motivation to practice and improve, which needs to be active a lot more of the time than just while "doing something".

If the choice is between, say, reading LessWrong and doing something difficult, my guess is the optimist will be more likely to work on the difficult thing, while the purely epistemic rationalist will get busy finding a way to justify reading LessWrong as being on task. ;-)

Don't get me wrong, I never said I liked this characteristic of evolved brains. But it's better not to fool ourselves about whether it's better not to fool ourselves. ;-)