R0k0

If the human race ends soon, there will have been fewer people in total. A prior that weights world-histories by their number of observers therefore assigns early doom a lower probability, and this lower prior exactly cancels the contribution from the doomsday argument.
Essentially the only consistent low-level rebuttal to the doomsday argument is to use the Self-Indication Assumption (SIA).
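A minimal sketch of the cancellation, in a standard two-step toy model; the notation ($N$, $n$, $P_0$) is mine, not the commenter's. Let $N$ be the total number of humans who will ever live, with non-anthropic prior $P_0(N)$, and let $n$ be my birth rank. The doomsday (self-sampling) step treats me as a uniform draw from the $N$ humans:

$$P(n \mid N) = \frac{1}{N}, \qquad n \le N.$$

SIA weights the prior by the number of observers, $P(N) \propto N \, P_0(N)$, so the posterior is

$$P(N \mid n) \;\propto\; N \, P_0(N) \cdot \frac{1}{N} \, \mathbf{1}[n \le N] \;=\; P_0(N) \, \mathbf{1}[n \le N].$$

Apart from ruling out worlds with fewer than $n$ people, the SIA weighting and the doomsday shift cancel exactly, returning the non-anthropic prior.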
What about rejecting the assumption that there will be finitely many humans? In the infinite case, the argument doesn't hold.
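In the same toy model (notation mine), the infinite case does not merely weaken the update but breaks it: the self-sampling likelihood

$$P(n \mid N) = \frac{1}{N} \longrightarrow 0 \quad \text{as } N \to \infty \text{ for every fixed } n,$$

and there is no countably additive uniform distribution on $\{1, 2, 3, \dots\}$, so $P(n \mid N = \infty)$ is undefined and the doomsday update cannot be computed.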
seems an arbitrary limit.
Your axiology is arbitrary. Everyone has arbitrary preferences, and arbitrary principles that generate those preferences. You are arbitrary: you can either live with that, or self-modify into something much less arbitrary, like a fitness maximizer, and lose your humanity.
I think that the answer to this conundrum is to be found in Joshua Greene's dissertation. On page 202 he says:
"The mistake philosophers tend to make is in accepting rationalism proper, the view that our moral intuitions (assumed to be roughly correct) must be ultimately justified by some sort of rational theory that we’ve yet to discover ... a piece of moral theory with justificatory force and not a piece of psychological description concerning patterns in people’s emotional responses."
When Eliezer presents himself with this dilemma, the neural/hormonal processes in his mind that govern reward and decision-making fire "Yes!" on each of a series of decisions that end up, in aggregate, losing him...
See Bostrom's paper.