orthonormal comments on Compartmentalization in epistemic and instrumental rationality - Less Wrong

Post author: AnnaSalamon 17 September 2010 07:02AM


Comment author: AnnaSalamon 17 September 2010 05:53:42PM  21 points

rwallace, as mentioned by whpearson, notes possible risks from de-compartmentalization:

Human thought is by default compartmentalized for the same good reason warships are compartmentalized: it limits the spread of damage.... We should think long and hard before we throw away safety mechanisms, and compartmentalization is one of the most important ones.

I agree that if you suddenly let reason loose on a landscape of locally optimized beliefs and actions, you may see significant downsides. And I agree that de-compartmentalization, in particular, can be risky. Someone who believes in heaven and hell but doesn’t think much about that belief will act fairly normally; someone who believes in heaven and hell and actually thinks through the expected consequences might have fear of hell govern all their actions.

Still, it seems to me that it is within the reach of most LW-ers to skip these downsides. The key is simple: the downsides from de-compartmentalization stem from allowing a putative fact to overwrite other knowledge (e.g., letting one’s religious beliefs overwrite knowledge about how to reason successfully in biology, or letting a simplified ev. psych overwrite one’s experience of which dating behaviors work). The solution is thus to be really damn careful not to let new claims overwrite old data.

That is: Listen to everything you know, including implicit, near-mode beliefs and desires. Be careful not to block contrary intuitions from view. Be careful not to decide ahead of time that your verbal/symbolic beliefs are accurate and your near-mode beliefs mistaken; instead, hug the query, ask where your intuitions are coming from, and keep your feelings and intuitions in view whether or not you know their source. Don’t let ideology or theory overwrite experience. And keep complex models (“evidence A points to X, while B would seem surprising if X were true...”), rather than rounding your evidence to its approximate conclusion.

Also, especially while you’re building these skills, ask yourself what most people would do in this circumstance, or what you would do with more compartmentalization. And then, if it seems like a better bet, do that thing. Eliezer discusses this as remembering that it all adds up to normality.

Someone should really write a top-level post about relevant safety skills. Phil’s was good; more would be better. Safety skills are important not only for reducing downsides, but also for allowing people to be less afraid, and so more able to acquire (the huge benefits of) rationality.

Comment author: orthonormal 19 September 2010 01:03:52AM  18 points

This reminds me:

When I finally realized that I was mistaken about theism, I did one thing I'm glad of: I decided to keep my system of ethics until I had what I saw as really good reasons to change bits of it. (This kept the nihilist period I inevitably passed through from doing too much damage to me and the people I cared about; and of course, in time, I realized that it was enough that I cared about these things, that the universe wasn't requiring me to act like a nihilist.)

Eventually, I did change some of my major ethical beliefs, but they were the ones that genuinely rested on false metaphysics, and not the ones that were truly a part of me.