fractalman comments on Counterfactual Mugging - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (258)
It's not capricious in the sense you give: you can predict some of its actions. Because Omega is assumed to be perfectly trustworthy, you can predict with certainty that it will do whatever it tells you it will do.
So if it says it'll give you $10k under some condition (say, if you one-box its challenge), you can predict that it will give you the money if that condition arises.
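The one-boxing logic here is the usual expected-value argument. A minimal sketch, assuming the standard Counterfactual Mugging stakes from the original post ($100 asked on tails, $10,000 paid on heads if Omega predicts you would have paid) and a fair coin:

```python
# Expected value of a policy in Counterfactual Mugging, under the
# standard assumptions: fair coin, perfectly trustworthy Omega,
# stakes of $100 (tails) and $10,000 (heads).
P_HEADS = 0.5

def expected_value(pays_when_asked: bool) -> float:
    # Heads: Omega pays $10,000 iff it predicts you would pay on tails.
    heads_payoff = 10_000 if pays_when_asked else 0
    # Tails: Omega asks for $100; you lose it only if your policy is to pay.
    tails_payoff = -100 if pays_when_asked else 0
    return P_HEADS * heads_payoff + (1 - P_HEADS) * tails_payoff

print(expected_value(True))   # policy: pay when asked  -> 4950.0
print(expected_value(False))  # policy: refuse to pay   -> 0.0
```

So, evaluated before the coin flip, the paying policy comes out ahead, which is why the problem is interesting once the coin has already landed tails.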
If it were capricious in the sense of being completely unpredictable, it might amputate three of your toes and give you a flower garland.
Note that the problem supposes you do have certainty that Omega is trustworthy; I see no way of reaching that epistemological state, but then again I see no way Omega could be omnipotent, either.
On a somewhat unrelated note, why would Omega ask you for $100 if it had simulated that you wouldn't give it the money? For that matter, why would it ask if it had simulated that you would? What possible use would an omnipotent agent have for $100?
Omega is assumed to be mildly bored and mildly anthropic. And its asking you for $100 could always be PART of the simulation.
Yes, it’s quite reasonable that if it was curious about you it would simulate you and ask the simulation a question. But once it did that, since the simulation was perfect, why would it waste the time to ask the real you? After all, in the time it takes you to understand Omega’s question it could probably simulate you many times over.
So I’m starting to think that encountering Omega is actually pretty strong evidence for the fact that you’re simulated.
Maybe Omega recognizes in advance that you might think this way, doesn't want that to happen, and so precommits to asking the real you. Given this precommitment, your reasoning above no longer goes through. Moreover, you should be able to figure out that Omega would so precommit, making it unnecessary for it to explicitly tell you it's doing so.
I don’t think, given the usual problem formulation, that one can figure out what Omega wants without Omega explicitly saying it, and maybe not even in that case.
It’s a bit like a deal with a not-necessarily-evil devil. Even if it tells you something, you’re sure it’s not lying, and you think the wording is perfectly clear, you should still assign a very high probability that you have no idea what’s really going on and why.