
DanielLC comments on Bayesian Doomsday Argument - Less Wrong Discussion

-5 Post author: DanielLC 17 October 2010 10:14PM




Comment author: DanielLC 18 October 2010 01:10:43AM 0 points [-]

You should weight the importance of your choices more heavily.

This doesn't mean the future doesn't matter; but it does mean that favoring it is no longer an obvious choice.

Suppose you do something that has a chance of saving the world, and suppose there have been 100 billion people so far. The expected amount of good you'd do is ∫ k/n dn = k ln(n₂/n₁). If there will be fewer than 200 billion people total, that's k ln 2. If fewer than 2·10^40, it's k ln(2·10^29). That works out to the latter case being about 100 times as important. That seems like a lot, but differences between charities tend to be measured in orders of magnitude anyway.
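The arithmetic above can be checked with a short sketch (the 100-billion past population and the 2·10^40 / 2·10^11 caps are the comment's own assumptions):

```python
import math

# Assumption from the comment: about 100 billion people have lived so far.
N_PAST = 1e11

def relative_importance(n_max, n_past=N_PAST):
    """Impact weight if at most n_max people ever live.

    The expected good done is proportional to
    integral of k/n dn from n_past to n_max = k * ln(n_max / n_past);
    this returns the ln(...) factor (taking k = 1).
    """
    return math.log(n_max / n_past)

small = relative_importance(2e11)  # at most 200 billion people -> ln 2
large = relative_importance(2e40)  # at most 2*10^40 people -> ln(2*10^29)

print(small)          # ~0.69
print(large)          # ~67.5
print(large / small)  # ~97, i.e. roughly 100 times as important
```

The ratio ln(2·10^29) / ln 2 comes out near 97, matching the "about 100 times" figure in the comment.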

I'm not sure 10^40 is the right figure, but I think its exponent is correct to within a factor of two, and since the importance scales with the logarithm, the resulting estimate would be off by at most that same factor.