shokwave comments on The elephant in the room, AMA - Less Wrong

Post author: calcsam 12 May 2011 02:59PM


Comment author: shokwave 12 May 2011 06:12:52PM 11 points

I'm interested in the power of your belief. For example, I believe strongly that, say, Michael Vassar is smart. I also believe strongly that the laws of physics hold everywhere. If these two beliefs were brought into conflict (say, Michael Vassar presented me with a perpetual motion machine blueprint) physics would win, because it's more powerful.

In that vein, I would like to take some of your time to ask you to come up with a quick power ranking of some of your deep beliefs. If your religion came into direct conflict with your faith, say? (I am not sure this is a fair question, actually - I personally can't imagine what would happen if my rationality came into conflict with my sense of truth, because they're so similar).

Comment author: Eugine_Nier 12 May 2011 06:18:36PM 7 points

I'm interested in the power of your belief. For example, I believe strongly that, say, Michael Vassar is smart. I also believe strongly that the laws of physics hold everywhere. If these two beliefs were brought into conflict (say, Michael Vassar presented me with a perpetual motion machine blueprint) physics would win, because it's more powerful.

Your concept of the power of a belief sounds a lot like its probability.

Comment author: shokwave 12 May 2011 06:38:17PM 8 points

That's because it is. The way I described power rankings working is isomorphic to this:

A Bayesian agent has two beliefs X and Y. If it discovered that X and Y are evidence against each other (Pr(X | Y) < Pr(X) and Pr(Y | X) < Pr(Y)), which belief will be updated more?

which is isomorphic to

How much evidence for X and how much for Y?

but those questions don't cause most human brains to give good answers.
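The mutual-disconfirmation condition above is easy to check numerically. Here is a minimal sketch; the joint distribution is an assumption made up purely for illustration, not anything from the thread:

```python
# Toy joint distribution over two beliefs X and Y, chosen (as an
# illustrative assumption) so that they are evidence against each
# other: Pr(X | Y) < Pr(X) and Pr(Y | X) < Pr(Y).
p = {(True, True): 0.10, (True, False): 0.50,
     (False, True): 0.30, (False, False): 0.10}

p_x = sum(v for (x, _), v in p.items() if x)   # Pr(X)     = 0.60
p_y = sum(v for (_, y), v in p.items() if y)   # Pr(Y)     = 0.40
p_x_given_y = p[(True, True)] / p_y            # Pr(X | Y) = 0.25
p_y_given_x = p[(True, True)] / p_x            # Pr(Y | X) ≈ 0.17

# Both conditionals fall below the corresponding priors:
print(p_x_given_y < p_x and p_y_given_x < p_y)  # True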

Comment author: Dorikka 13 May 2011 04:57:23PM 1 point

I think that thinking in terms of probability is going to be more conducive to careful thinking than thinking in terms of power. We've got a lot of emotional connections and alternative definitions attached to the word "power" which we don't really want interfering with our reasoning, and which don't come up when we speak of probability.

Comment author: Eliezer_Yudkowsky 14 May 2011 04:59:01AM 3 points

I kinda disagree here. If you show me an exact Bayesian network, I can read off it the degree to which evidence for one proposition is evidence against another. If you don't give an exact interpretation in probability theory, then isn't talking about "probability" instead of "power" just pretending to precision? Jumping to "probability" is something that has to be earned, and to me it's not yet obvious that for all Bayesian graphs, if P(A) > P(B) > 0.5, then learning the truth of a descendant node which proves !(A & B) will cause B to decrease in probability more than A.

Comment author: paulfchristiano 14 May 2011 05:07:13AM 2 points

and to me it's not yet obvious that for all Bayesian graphs, if P(A) > P(B) > 0.5, then learning the truth of a descendant node which proves !(A & B) will cause B to decrease in probability more than A.

Consider learning "not A," for example.
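To spell the counterexample out with numbers (the priors and the independence assumption are mine, purely for illustration): if the thing you learn is simply not-A, it proves !(A & B), yet it is A, not B, that takes the bigger hit.

```python
# Priors (illustrative assumption): P(A) = 0.9 > P(B) = 0.6 > 0.5,
# with A and B independent.
p_a, p_b = 0.9, 0.6

# Learn E = not-A. E proves !(A & B), since A & B implies A.
post_a = 0.0   # A is refuted outright
post_b = p_b   # by independence, P(B | not-A) = P(B)

drop_a = p_a - post_a   # 0.9
drop_b = p_b - post_b   # 0.0

# A decreased far more than B, contrary to the conjecture above.
print(drop_a > drop_b)  # True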

Comment author: Dorikka 14 May 2011 05:33:39AM 0 points

The tradeoff occurring here seems to be reducing the possibility of triggering biases versus reducing the possibility that you're fooling yourself into thinking that your thought is more precise than it really is. I would go with the first; if I felt that I was being insufficiently precise in a certain situation, I could use a couple of checks, such as seeing whether my reasoning managed to distinguish fiction from reality effectively.

On a more concrete note, I read this:

If these two beliefs were brought into conflict (say, Michael Vassar presented me with a perpetual motion machine blueprint) physics would win, because it's more powerful.

as judging that if he estimated P(A)>P(B), P(A) would remain greater than P(B) given !(A&B), not as saying that !(A&B) was stronger evidence against B than against A.
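Under that weaker reading the claim does hold at least when the beliefs are independent. A quick sketch (the numbers and the independence assumption are mine): conditioning on !(A & B) lowers both probabilities but preserves the ordering.

```python
# Independent beliefs with P(A) = 0.9 > P(B) = 0.6 (numbers and
# independence are illustrative assumptions). Condition on !(A & B).
a, b = 0.9, 0.6
p_not_ab = 1 - a * b                 # P(!(A & B)) = 0.46

post_a = a * (1 - b) / p_not_ab      # P(A | !(A & B)) ≈ 0.78
post_b = b * (1 - a) / p_not_ab      # P(B | !(A & B)) ≈ 0.13

# Both beliefs are lowered, but A stays ahead of B:
print(post_a < a and post_b < b)     # True
print(post_a > post_b)               # True
```

With a > b we always have a(1 - b) > b(1 - a), so under independence the ordering survives regardless of the particular numbers chosen.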

Comment author: calcsam 13 May 2011 01:37:36AM 2 points

If your religion came into direct conflict with your faith.

Confused. What do you mean exactly? (Did you mean to type 'your reason'? Or something else?)

Comment author: shokwave 13 May 2011 05:41:10AM 11 points

I make a few presumptions here; correct me if I'm wrong.

I presume you do not simply have total faith in everything in Latter Day Saint doctrine; you don't experience a sense of rightness on every single line of every single religious text (I've never met a religious person who does; this is something that only happens in strawman atheism arguments). But presumably you have also experienced a sense of rightness regarding some large part of LDS theology (again, based on my experiences with religious people), as that would be why you converted.

Now here's the tricky part. If you read something that struck you as right - you got that sense of rightness about it - but when you shared it you found it was directly contradicting some doctrine of LDS, what would happen? Would you stop thinking the thing was right, or would you adjust your view of the LDS Church slightly downwards?

(The reason I am not sure this is a fair question is that if you asked me the same question in terms of rationality and truth-feeling, I would have a hard time not picking it apart, although in the least convenient possible world I would closely examine both my rationality and my truth-feeling, and then rationality would win.)