Desrtopa comments on [SEQ RERUN] Mere Messiahs - Less Wrong

Post author: MinibearRex 13 November 2011 06:54AM




Comment author: TheOtherDave 13 November 2011 02:15:38PM 2 points [-]

If I were actually willing to risk all of my remaining observer-moments in order to (e.g.) shelter Jews from Hitler, and my actual willingness to do that were not noticeably affected by (e.g.) signing up for cryonics, I would probably conclude that I don't actually believe that signing up for cryonics significantly increases my expected number of observer-moments, but rather was experiencing a "belief in belief" in immortality through cryonics.

Comment author: Desrtopa 13 November 2011 02:26:21PM 1 point [-]

You could genuinely believe it, and be willing to make predictions or even bets based on your belief, but that doesn't mean you've internalized it.

Comment author: TheOtherDave 13 November 2011 04:34:23PM 0 points [-]

I agree that there's a distinction here, though it strikes me as one of degree rather than kind.

I would say the same thing about the two dragon-claiming, garage-owning neighbors... in both cases, their minds fail to associate representations of the dragon in their garage with various other representations that would constrain their behavior in various ways. Whether we call that "belief in belief" or "failure to internalize" or "not thinking it through" or "being confused" or "not noticing the implications" or "failing to be a tactical genius" or "being really stupid" depends on a lot of different things.

That said, I don't think the purely labeling question matters much; I'm happy to adopt your preferred labels if it facilitates communication.

If I'm understanding your comment correctly, you're suggesting that the threshold between "belief in belief" and "failure to internalize" in this case has to do with the willingness to make predictions/bets -- e.g., if I'm willing to give someone a large sum of money in exchange for a reliable commitment to give me a much, much larger sum of money after I am restored from cryonic suspension, then we say I have a "genuine belief" in cryonics and not a mere "belief in belief", although I might still fail to have an "internalized belief"... is that right?

If so, then sure, I agree... in the situation I describe, I might have a genuine but non-internalized belief in cryonics.

Comment author: Eugine_Nier 13 November 2011 08:17:35PM 1 point [-]

I would argue that the only difference between "belief in belief" and "failure to internalize" is whether the belief in question corresponds to external reality. The state of the brain is exactly the same in both situations.

Comment author: TheOtherDave 14 November 2011 01:54:31AM 1 point [-]

What does external reality have to do with it? Can I not have belief in belief in a proposition that happens to describe reality?

Comment author: Desrtopa 13 November 2011 04:38:59PM 0 points [-]

> If I'm understanding your comment correctly, you're suggesting that the threshold between "belief in belief" and "failure to internalize" in this case has to do with the willingness to make predictions/bets -- e.g., if I'm willing to give someone a large sum of money in exchange for a reliable commitment to give me a much, much larger sum of money after I am restored from cryonic suspension, then we say I have a "genuine belief" in cryonics and not a mere "belief in belief", although I might still fail to have an "internalized belief"... is that right?

Sounds right.

Comment author: TheOtherDave 13 November 2011 04:56:17PM 0 points [-]

Cool.

Having clarified that: can you say more about why the distinction (between belief in belief in cryonics, and genuine but non-internalized belief in cryonics) is important in this context? That is... why do you bring it up?

Comment author: Desrtopa 13 November 2011 05:29:39PM 0 points [-]

Well, if someone had belief in belief in cryonics, they might say that cryonics would preserve people and allow them to be brought back in the future, but every time they have to make a falsifiable prediction based on it, they'll find an excuse to avoid backing that prediction. If they're willing to make falsifiable predictions based on their belief, but go on behaving the same as if they expected to live only several more decades, they probably only have a far-mode belief in cryonics.

It takes different input to bring a far-mode belief into near mode than to convince someone to actually believe something they formerly only had belief in belief in.

Comment author: endoself 14 November 2011 02:51:49AM 1 point [-]

It seems like the major difference here is compartmentalization. Someone who only takes a belief into account when it is explicitly called to their attention has a belief that is not internalized.