If he tells every person the results of all the other trials, I am in effect a random person from all the persons in all the trials, because everyone is treated equally. To simplify the math, let's suppose there were just 2 trials. Starting with the prior probabilities based on the coin toss: there is a 25% chance of a total of just 2 observers behind red doors, in which case I would have a 100% chance of being behind a red door. There is a 50% chance of 1 observer behind a red door and 99 observers behind blue doors, which would give me a 99% chance of being behind a blue door. And there is a 25% chance of 198 observers behind blue doors, which would give me a 100% chance of being behind a blue door. So my total prior probability of being behind a red door is 25.5%, and of being behind a blue door, 74.5%.
Let's suppose I am told that the other trial resulted in just one observer behind a red door. First we need the prior probability of being told this. If there were two red doors (25% chance), there would be a 100% chance of this. If there were two blue doors (25% chance), there would be a 0% chance of this. If there were a red door and a blue door (50% chance), there would be a 99% chance of this. So the total prior probability of being told that the other trial resulted in a red door is again 74.5%, and the probability of being told that the other trial resulted in a blue door is 25.5%.
One more probability: given that I am behind a red door, what is the probability that I will be told that the other trial resulted in an observer behind a red door? There was originally a 25% chance of two red trials, and a 50% chance of 1 red and 1 blue trial. This implies that given that I am behind a red door, there is a 1/3 chance that I will be told that the other trial resulted in red, and a 2/3 chance that I will be told that the other trial resulted in blue. (Once again things will change if we run more trials, for similar reasons, because in the 1/3 case, there are 2 observers behind red doors.)
Applying Bayes' theorem, then, the probability that I am behind a red door given that I am told that the other trial resulted in an observer behind a red door, is (.255 / .745) x (1/3) = approximately 11.4%. So the probability that I am behind a blue door is approximately 88.6%. Since it was originally only 74.5% with two trials, information about the other trial did contribute to knowledge of my door. The same will happen as you add more trials and more information.
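The two priors above can be double-checked by direct enumeration. A minimal sketch in Python, under my own encoding of the setup as I read it (each trial flips a fair coin, putting 1 observer behind a red door on one outcome and 99 observers behind blue doors on the other, with "me" drawn uniformly from the observers of whichever world results):

```python
from fractions import Fraction

# Three possible worlds from two independent fair coin flips:
# (prior probability, red observers, blue observers)
worlds = [
    (Fraction(1, 4), 2, 0),    # both trials came up red
    (Fraction(1, 2), 1, 99),   # one red trial, one blue trial
    (Fraction(1, 4), 0, 198),  # both trials came up blue
]

# I am a uniformly random observer within whichever world is realized.
p_red = sum(p * Fraction(red, red + blue) for p, red, blue in worlds)
print(p_red, float(p_red))          # 51/200 = 0.255
print(1 - p_red, float(1 - p_red))  # 149/200 = 0.745
```

Exact fractions avoid rounding questions: 51/200 and 149/200 match the 25.5% and 74.5% figures above.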
Well you very nearly ruined my weekend. :-)
I admit I was blindsided by the possibility that information about the other trials could yield information about your door. I'll have to review the Monty Hall problem.
Using your methods, I got:
P(blue | told red) = (.745 prior of being blue / .745 prior of being told red) × (2/3 chance of being told red given blue) = .666...
Which doesn't match your 11.4%, so something is missing.
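For what it's worth, the model in the parent comment can be enumerated exactly. A sketch under my reading of the setup (world priors fixed at 25/50/25, the observer drawn uniformly within whichever world is realized, and "told red" meaning the other trial produced the red-door observer):

```python
from fractions import Fraction

# Each case: (probability, my door colour, colour reported for the other trial).
# In the mixed world I'm red with prob 1/100 (so the other trial is blue),
# or blue with prob 99/100 (so the other trial is red).
cases = [
    (Fraction(1, 4), "red", "red"),                       # both trials red
    (Fraction(1, 2) * Fraction(1, 100), "red", "blue"),   # mixed, I'm the red one
    (Fraction(1, 2) * Fraction(99, 100), "blue", "red"),  # mixed, I'm a blue one
    (Fraction(1, 4), "blue", "blue"),                     # both trials blue
]
joint = {}  # (my colour, told colour) -> probability
for p, mine, told in cases:
    joint[(mine, told)] = joint.get((mine, told), 0) + p

p_told_red = joint[("red", "red")] + joint[("blue", "red")]
p_red_given_told_red = joint[("red", "red")] / p_told_red
print(p_told_red)            # 149/200 = 0.745
print(p_red_given_told_red)  # 50/149, about 0.336 red, so about 0.664 blue
```

Under these assumptions the posterior is 99/149 ≈ 66.4% blue given being told red, close to the .666... figure above; the likelihood P(told red | red) comes out to 50/51 rather than 1/3 in this enumeration.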
In scenario F, if you're not told, why assume that your trial was the only one in the set? You should have some probability that the omegas would do this more than once.
EDIT: This post has been superseded by this one.
The doomsday argument, in its simplest form, claims that since 2/3 of all humans will be in the final 2/3 of all humans, we should conclude it is more likely that we are in the final two-thirds of all humans who will ever live than in the first third. In our current state of quasi-exponential population growth, this would mean that we are likely very close to the final end of humanity. The argument gets somewhat more sophisticated than that, but that's it in a nutshell.
There are many immediate rebuttals that spring to mind - there is something about the doomsday argument that brings out the certainty in most people that it must be wrong. But nearly all those supposed rebuttals are erroneous (see Nick Bostrom's book Anthropic Bias: Observation Selection Effects in Science and Philosophy). Essentially the only consistent low-level rebuttal to the doomsday argument is to use the self-indication assumption (SIA).
The non-intuitive form of SIA simply says that since you exist, it is more likely that your universe contains many observers, rather than few; the more intuitive formulation is that you should consider yourself as a random observer drawn from the space of possible observers (weighted according to the probability of that observer existing).
Even in that form, it may seem counter-intuitive; but I came up with a series of small steps leading from a generally accepted result straight to the SIA. This clinched the argument for me. The starting point is:
A - A hundred people are created in a hundred rooms. Room 1 has a red door (on the outside); the outsides of all the other doors are blue. You wake up in a room, fully aware of these facts; what probability should you put on being inside a room with a blue door?
Here, the probability is certainly 99%. But now consider the situation:
B - same as before, but an hour after you wake up, it is announced that a coin will be flipped, and if it comes up heads, the guy behind the red door will be killed, and if it comes up tails, everyone behind a blue door will be killed. A few minutes later, it is announced that whoever was to be killed has been killed. What are your odds of being blue-doored now?
There should be no difference from A: since your odds of dying are exactly fifty-fifty whether you are blue-doored or red-doored, updating on the announcement should not change your probability estimate. The further modifications are then:
C - same as B, except the coin is flipped before you are created (the killing still happens later).
D - same as C, except that you are only made aware of the rules of the set-up after the people to be killed have already been killed.
E - same as C, except the people to be killed are killed before awakening.
F - same as C, except the people to be killed are simply not created in the first place.
I see no justification for changing your odds as you move from A to F; but 99% odds of being blue-doored at F is precisely the SIA: you are saying that a universe with 99 people in it is 99 times more probable than a universe with a single person in it.
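The claim that the 99% survives the walk from A to F can be sanity-checked numerically. A sketch, under my own encoding of the scenarios rather than anything from the post itself: scenario B as a plain Bayesian update on having survived the coin flip, and scenario F as the SIA weighting of worlds by observer count.

```python
from fractions import Fraction

# Scenario B: 1 red-doored and 99 blue-doored people exist; a fair coin
# then kills either the red person (heads) or all the blue people (tails).
# Update on the evidence "I survived".
prior_blue = Fraction(99, 100)
p_survive_given_blue = Fraction(1, 2)  # blues survive iff the coin is heads
p_survive_given_red = Fraction(1, 2)   # red survives iff the coin is tails
p_survive = (prior_blue * p_survive_given_blue
             + (1 - prior_blue) * p_survive_given_red)
posterior_blue_B = prior_blue * p_survive_given_blue / p_survive
print(posterior_blue_B)  # 99/100 -- surviving tells you nothing

# Scenario F: the coin is flipped first; heads means the red-doored person
# is never created (99 blues exist), tails means the blues are never
# created (1 red exists).  SIA weights each world by its observer count.
worlds = [(Fraction(1, 2), 99, "blue"), (Fraction(1, 2), 1, "red")]
total_weight = sum(p * n for p, n, _ in worlds)
posterior_blue_F = sum(p * n for p, n, c in worlds if c == "blue") / total_weight
print(posterior_blue_F)  # 99/100 -- the same 99%, as the chain argues
```

Both come out to 99/100, consistent with the claim that no step in the chain changes the odds.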
If you can't see any flaw in the chain either, then you can rest easy, knowing the human race is no more likely to vanish than objective factors indicate (ok, maybe you won't rest that easy, in fact...)
(Apologies if this post is preaching to the choir of flogged dead horses along well beaten tracks: I was unable to keep up with Less Wrong these past few months, so may be going over points already dealt with!)
EDIT: Corrected the language in the presentation of the SIA, after SilasBarta's comments.
EDIT2: There are some objections to the transfer from D to C. Thus I suggest sliding in C' and C'' between them; C' is the same as D, except those due to die have the situation explained to them before being killed; C'' is the same as C' except those due to die are told "you will be killed" before having the situation explained to them (and then being killed).