Benja comments on Earning to Give vs. Altruistic Career Choice Revisited - Less Wrong

34 Post author: JonahSinick 02 June 2013 02:55AM

Comment author: Eliezer_Yudkowsky 28 May 2013 11:10:16PM 1 point

If you're one of 10^11 sentients to be born on Ancient Earth with a golden opportunity to influence a roughly 10^80-sized future, what exactly is a 'vanishing chance'... eh, let's all save it until later.

Comment author: Benja 28 May 2013 11:56:57PM 10 points

I meant that the alieved probability is small in absolute terms, not that it is small compared to the payoff. That's why I mentioned the "stick to the mainline probability" heuristic. I really do believe that there are many people who, if they alieved that they (or a group effort they could join) could change the probability of a 10^80-sized future by 10%, would really care; but who do not alieve that the probability is large enough to even register, as a probability; and whose brains will not attempt to multiply a not-even-registering probability with a humongous payoff. (By "alieving a probability" I simply mean processing the scenario the way one's brain processes things it assigns that amount of credence, not a conscious statement about percentages.)
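Illustrative only: the multiplication Benja says these brains refuse to perform is simple expected-value arithmetic. The sketch below uses the 10^80-sized payoff and 10% shift from the comment, plus a hypothetical placeholder (1e-9) for a "not-even-registering" probability; that placeholder is my assumption, not a figure from the discussion:

```python
# Hedged sketch of the expected-value multiplication described above.
# The 1e-9 "not-even-registering" probability is an arbitrary placeholder.

payoff = 1e80        # size of the future (say, in sentient lives)
delta_p = 0.10       # the 10% probability shift from the comment
tiny_p = 1e-9        # hypothetical vanishing probability

ev_shift = delta_p * payoff  # value of a 10% shift: about 1e79
ev_tiny = tiny_p * payoff    # even the vanishing case: about 1e71

print(f"{ev_shift:.2e}  {ev_tiny:.2e}")
```

The point of the comparison: once the multiplication is actually carried out, even the vanishing probability yields an astronomical product, which is why a brain that declines to multiply at all behaves so differently from one that does.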

This is meant as a statement about people's actual reasoning processes, not about what would be reasonable (though I did think that you didn't feel that multiplying a very small success probability with a very large payoff was a good reason to donate to MIRI; in any case, it seems to me that the more important unreasonableness is requesting mountains of evidence before alieving a non-vanishing probability for weird-sounding things).

[ETA: I find it hard to put a number on the not-even-registering probability the sort of person I have in mind might actually alieve, but I think a fair comparison is, say, the "LHC will create black holes" thing -- I think people will tend to process both in a similar way, and this does not mean that they would shrug it off if somebody counterfactually actually did drop a mountain of evidence about either possibility on their head.]

Comment author: Eliezer_Yudkowsky 29 May 2013 10:48:00PM 5 points

though I did think that you didn't feel that multiplying a very small success probability with a very large payoff was a good reason to donate to MIRI

Because on a planet like this one, there ought to be some medium-probable way for you and a cohort of like-minded people to do something about x-risk, and if a particular path seems low probability, you should look for one that's at least medium-probability instead.

Comment author: Benja 29 May 2013 10:57:04PM 2 points

OK, fair enough. (I had misunderstood you on that particular point, sorry.)