I don't think this requires anthropic reasoning.
Here is a variation on the story:
...One day, you and the presumptuous philosopher are walking along, arguing about the size of the universe, when suddenly Omega jumps out from behind a bush and knocks you both out with a crowbar. While you're unconscious, she builds a hotel with 1,000,001 rooms. Then she makes a million copies of both of you, sticks them all in rooms, and destroys the originals.
You wake up in a hotel room, in bed with the presumptuous philosopher, with a note on the table from Omega, explaining what she's done.
One difference between the hotel case and universes is that you can't be in two hotels at once, but you might be able to exist in two different models of the universe.
You run out of the room to find yourself in a huge, ten-thousand-story atrium, filled with throngs of yourselves and smug-looking presumptuous philosophers.
One of the other copies just got ten bucks; you lost nothing. Nice work bluffing your presumptuous friend and pumping his ego for (a chance at) cash. I just hope you think things through a bit more thoroughly if you ever have to lay cash on the line. Or that you have good reason to value the outcome of the one copy equally with that of the million in the other hotel.
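The arithmetic behind the disagreement can be sketched as follows. This is a minimal illustration; the copy split (one copy of you at the small hotel, the other 999,999 at the big one) is an assumption read off the story, not stated explicitly in it.

```python
# Expected winnings on the ten-dollar bet, under the two views of the story.
# Assumed occupancy: 1 copy at the small hotel, 999,999 at the big one.
N_COPIES = 1_000_000
SMALL_HOTEL_COPIES = 1
PAYOFF_SMALL = 10  # dollars you win if you find yourself at the small hotel
PAYOFF_BIG = 0     # at the big hotel, the philosopher just smiles smugly

# Non-anthropic view: each *hotel* is equally likely, regardless of occupancy.
p_small_nonanthropic = 1 / 2
ev_nonanthropic = p_small_nonanthropic * PAYOFF_SMALL  # "an expected five bucks"

# Anthropic view: each *copy* is equally likely to be the one you wake up as.
p_small_anthropic = SMALL_HOTEL_COPIES / N_COPIES
ev_anthropic = p_small_anthropic * PAYOFF_SMALL  # about a thousandth of a cent

# Summed over all million copies, exactly one ten-dollar payout ever occurs.
total_payout = SMALL_HOTEL_COPIES * PAYOFF_SMALL

print(ev_nonanthropic, ev_anthropic, total_payout)
```

The last line is the point of the comment above: whichever way you weight the probabilities, across the whole population of copies only one of them collects $10.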
This is a trivial problem that need n...
I wonder... could we please use Omega less often unless absolutely required? (And if Omega is absolutely required, that strongly suggests something is wrong with the story anyway.)
While you're unconscious, she builds two hotels, one with a million rooms, and one with just one room. Then she makes a million copies of both of you, sticks them all in rooms, and destroys the originals.
I feel... thin. Sort of stretched, like... butter scraped over too much bread.
Why do we spend so much time thinking about how to reason on problems in which
a) you know what's going on while you're not conscious, and
b) you take at face value information fed to you by a hostile entity?
This entire theoretical framework rests on two assumptions: that "she makes a million copies of both of you, sticks them all in rooms, and destroys the originals" is meaningfully possible, which it may not be; and that doing so would result in a "you" that is somehow continuous, which is not clear and may not be experimentally verifiable.
And of course, if you ever encountered an Omega hypothetical in real life, you'd decide that "He's lying" has P ≈ 1. Perhaps that's why Omega keeps getting used; all Omega hypotheticals have that property in common, I believe.
One day, you and the presumptuous philosopher are walking along, arguing about the size of the universe, when suddenly Omega jumps out from behind a bush and knocks you both out with a crowbar. While you're unconscious, she builds two hotels, one with a million rooms, and one with just one room. Then she makes a million copies of both of you, sticks them all in rooms, and destroys the originals.
You wake up in a hotel room, in bed with the presumptuous philosopher, with a note on the table from Omega, explaining what she's done.
"Which hotel are we in, I wonder?" you ask.
"The big one, obviously" says the presumptuous philosopher. "Because of anthropic reasoning and all that. Million to one odds."
"Rubbish!" you scream. "Rubbish and poppycock! We're just as likely to be in any hotel omega builds, regardless of the number of observers in that hotel."
"Unless there are no observers, I assume you mean" says the presumptuous philosopher.
"Right, that's a special case where the number of observers in the hotel matters. But except for that it's totally irrelevant!"
"In that case," says the presumptuous philosopher, "I'll make a deal with you. We'll go outside and check, and if we're at the small hotel I'll give you ten bucks. If we're at the big hotel, I'll just smile smugly."
"Hah!" you say. "You just lost an expected five bucks, sucker!"
You run out of the room to find yourself in a huge, ten-thousand-story atrium, filled with throngs of yourselves and smug-looking presumptuous philosophers.