Will_Newsome comments on Abnormal Cryonics - Less Wrong

56 Post author: Will_Newsome 26 May 2010 07:43AM


Comment author: Will_Newsome 26 May 2010 11:02:41AM *  1 point [-]

Correct: it is simply an argument against certainty in either direction. It is the certainty that I find worrisome, not the conclusion. Now that I look back, I think I failed to duly emphasize the symmetry of my arguments.

Comment author: Vladimir_Nesov 26 May 2010 11:18:26AM 0 points [-]

And which way is certainty? There is no baseline in beliefs, around the magical "50%". When a given belief diminishes, its opposite grows in strength. At which point are they in balance? Is the "normal" level of belief the same for everything? Russell's teapot? The sky is blue?

Comment author: Will_Newsome 26 May 2010 11:41:31AM *  0 points [-]

Here I show my ignorance. I thought I was describing the flattening of a probability distribution over both of the propositions 'I will reflectively endorse that signing up for cryonics was the best thing to do' and 'I will reflectively endorse that not signing up for cryonics was the best thing to do'. (This is very different from the binary distinction between 'Signing up for cryonics is the current best course of action' and 'Not signing up for cryonics is the current best course of action'.) You seem to be saying that this is meaningless because I am not flattening the distributions relative to anything else, whereas my intuition is that I should be flattening them toward the shape of some ignorance prior. (I should note that I am using technical terms I do not fully understand: I am a mere novice in Bayesian probability theory, as distinct from Bayesianism.) I feel like you have made a valid point but that I am failing to see it.

Comment author: steven0461 26 May 2010 08:35:28PM *  4 points [-]

So it looks like what's going on is you have estimates for U(cryonics) and U(not cryonics), and structural confusion increases the variance for both these utilities, and Vladimir is saying this doesn't change the estimate of U(cryonics) - U(not cryonics), and you're saying it increases P(U(not cryonics) > U(cryonics)) if your estimate of U(cryonics) starts out higher, and both of you are right?
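This reconciliation can be checked numerically. The sketch below is a hypothetical model, not anything specified in the thread: it treats U(cryonics) and U(not cryonics) as independent normal estimates, and shows that widening both distributions leaves the expected difference untouched while pushing P(U(not cryonics) > U(cryonics)) toward 50%.

```python
import random

def p_not_cryonics_wins(mu_c, mu_nc, sigma, n=100_000, seed=0):
    """Monte Carlo estimate of P(U(not cryonics) > U(cryonics)),
    modeling both utilities as independent normals with a shared
    standard deviation sigma (an illustrative assumption)."""
    rng = random.Random(seed)
    wins = sum(
        rng.gauss(mu_nc, sigma) > rng.gauss(mu_c, sigma)
        for _ in range(n)
    )
    return wins / n

# The mean difference mu_c - mu_nc is 1.0 in both runs, so the
# point estimate of U(cryonics) - U(not cryonics) is unchanged;
# only the probability that the ordering flips moves toward 50%.
low_var = p_not_cryonics_wins(mu_c=1.0, mu_nc=0.0, sigma=0.5)
high_var = p_not_cryonics_wins(mu_c=1.0, mu_nc=0.0, sigma=5.0)
```

Both parties' claims hold in this toy model: the difference in means never changes, but the high-variance run yields a win probability much closer to 50% than the low-variance run.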

Comment author: Will_Newsome 26 May 2010 08:42:46PM 0 points [-]

That seems correct to me.

Comment author: Will_Newsome 26 May 2010 12:27:12PM *  0 points [-]

This is an attempt at resolving my own confusion:

Suppose there is a coin that is going to be flipped, and I have been told that it is biased towards heads, so I bet on heads. Suppose I am then informed that it is in fact biased in a random direction: all of a sudden I should reconsider whether betting on heads is the best strategy. I might not switch to tails (there is a cost to switching, and anyway I had some evidence that heads was the direction of bias, even if that evidence later turned out to be less than totally informative), but I will move my estimate of my success a lot closer to 50%.
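The coin story can be sketched in Bayesian terms (all numbers below are hypothetical): the predictive probability of heads is a mixture over the two possible bias directions, and making the direction less certain pulls the prediction back toward 50%.

```python
def predictive_p_heads(p_heads_favored, bias_strength=0.7):
    """P(next flip is heads), given probability p_heads_favored that
    the coin favors heads, where the favored side comes up with
    probability bias_strength (both numbers are made up)."""
    return (p_heads_favored * bias_strength
            + (1 - p_heads_favored) * (1 - bias_strength))

# Before: near-certain the bias is toward heads.
confident = predictive_p_heads(0.95)   # ~0.68
# After: the direction is only weakly known, so the prediction
# collapses most of the way back toward 50%.
uncertain = predictive_p_heads(0.60)   # ~0.54
```

Heads may still be the better bet after the update, matching the comment's point that one need not switch to tails even as the success estimate moves toward 50%.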

I seem to be arguing that when there is a lot of uncertainty about the model, I should treat any given P and not-P as roughly equally likely, because this seems like the best ignorance prior for a binary event about which I have very little information. When one learns that there is a lot of structural/metaphysical uncertainty around the universe, identity, et cetera, one should revise one's probabilities for any obviously relevant P/not-P pair towards 50% each, and note that one would not be too surprised by either result being true (since one now expects almost anything to happen).
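The "revise towards 50%" move described above can be written as mixing each belief with a uniform ignorance prior, where the weight on the prior reflects how much structural uncertainty has been discovered. The weighting scheme here is my illustration, not anything the comment specifies.

```python
def flatten_toward_ignorance(p, w):
    """Mix a belief P(proposition) with the 50/50 ignorance prior
    for a binary event; w in [0, 1] is the weight placed on the
    ignorance prior (w = 1 means total structural uncertainty)."""
    return (1 - w) * p + w * 0.5

# A confident belief is pulled most of the way back to 50% when
# structural uncertainty is high, and its complement moves
# symmetrically, so P and not-P remain a coherent pair.
flattened = flatten_toward_ignorance(0.9, w=0.8)    # ~0.58
complement = flatten_toward_ignorance(0.1, w=0.8)   # ~0.42
```

Because the mixture is linear, P and not-P always still sum to one after flattening, which is the symmetry the original post's author says he failed to emphasize.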