wedrifid comments on The Contrarian Status Catch-22 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (99)
I haven't seen you take into account the relative costs of error of the two beliefs.
A few months ago, I asked:
I think that someone who believes in many-worlds will keep drawing cards until they die. Someone who believes in one world might not. An expected-utility maximizer would; but I'm uncomfortable about playing the lottery with the universe if it's the only one we've got.
If a rational, ethical one-worlds believer doesn't continue drawing cards as long as they can, in a situation where the many-worlds believer would, then we have an asymmetry in the cost of error. Building an FAI that believes in one world, when many worlds is true, causes (possibly very great) inefficiency and repression to delay the destruction of all life. Building an FAI that believes in many worlds, when one world is true, results in annihilating all life in short order. This large asymmetry is enough to compensate for a large asymmetry in probabilities.
(My gut instinct is that there is no asymmetry, and that having a lot of worlds shouldn't make you less careful with any of them. But that's just my gut instinct.)
Also, I think that you can't, at present, both be rational about updating in response to the beliefs of others, and dismiss one-world theory as dead.
Omega clearly has more than one universe up his sleeve. It doesn't take too many doublings of my utility function before a further double would require more entropy than is contained in this one. Just how many galaxies worth of matter perfectly optimised for my benefit do I really need?
The problem here is that it is hard to imagine Omega actually being able to double utility. Doubling utility is hard. It really would be worth the risk of gambling indefinitely if Omega actually had the power to do what he promised. If it isn't worth it, then by definition you have your utility function wrong. In fact, if exactly half of the cards killed you and the other half doubled your utility, it would still be worth gambling unless you assign exactly 0 utility, in the case of your death, to everything else in the universe.
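The arithmetic behind that last claim can be sketched quickly. The function below is illustrative (none of these names come from the thread); it just compares the expected value of drawing against standing pat with your current utility U:

```python
def expected_utility_of_draw(current_utility, death_utility, p_death=0.5):
    """EV of drawing one card: with probability p_death you die and
    receive death_utility; otherwise your utility doubles."""
    return p_death * death_utility + (1 - p_death) * (2 * current_utility)

U = 100.0

# Death worth exactly 0: EV = 0.5*0 + 0.5*200 = 100, same as not drawing.
print(expected_utility_of_draw(U, death_utility=0.0))    # 100.0 -> indifferent

# Death worth anything negative: drawing is strictly worse.
print(expected_utility_of_draw(U, death_utility=-10.0))  # 95.0 -> decline
```

With p_death = 0.5, the doubling branch alone already matches U, so the decision hinges entirely on the sign of the utility you assign to the death outcome — which is the point being made above.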
Omega knows you'll draw a skull before you get that many doublings.
That would be a different problem. Either the participant is informed that the probability distribution in question has anthropic bias based on the gamemaster's limits or the gamemaster is not Omega-like.