What probability do you think I should reasonably assign to the proposition, advanced by some bunch of guys (with at most some accomplishments in the highly non-gradable field of philosophy) led by a person with no formal education, no prior job experience, and no quantifiable accomplishments, that they should be given money to hire more people to develop their ideas on how to save the world from a danger they are uniquely adept at seeing? The prior here is so laughably low that you could hardly find a study so flawed it wouldn't be a vastly better explanation for SI's behavior than its mission statement taken at face value, even before we take SI's prior record into account.
What is this, the second coming of C.S. Lewis and his trilemma? SI must either be completely right (demi-gods who will save us all) or deluded fools suffering from some psychological bias. Can you really think of no intermediates between 'saviors of humanity' and 'deluded fools who cannot possibly do any good' which might apply?
I just wanted to point out that invoking Dunning-Kruger is an incredible abuse of psychological research and does not reflect well on either you or Dmytry, and now you want me to justify SI entirely...
The lack of other alternatives is evidence against SI's cause.
The existence of alternatives would be evidence against donating too, since what makes you think SI is the best one out of all of them? Curious how, either way, one should not donate!
Nick Szabo on acting on extremely long odds with claimed high payoffs:
Beware of what I call Pascal's scams: movements or belief systems that ask you to hope for or worry about very improbable outcomes that could have very large positive or negative consequences. (The name comes of course from the infinite-reward Wager proposed by Pascal: these days the large-but-finite versions are far more pernicious). Naive expected value reasoning implies that they are worth the effort: if the odds are 1 in 1,000 that I could win $1 billion, and I am risk and time neutral, then I should expend up to nearly $1 million dollars worth of effort to gain this boon. The problems with these beliefs tend to be at least threefold, all stemming from the general uncertainty, i.e. the poor information or lack of information, from which we abstracted the low probability estimate in the first place: because in the messy real world the low probability estimate is almost always due to low or poor evidence rather than being a lottery with well-defined odds.
Nick clarifies in the comments that he is indeed talking about singularitarians, including his GMU colleague Robin Hanson. The post appears to revisit a comment he made on an earlier post:
In other words, just because one comes up with quasi-plausible catastrophic scenarios does not put the burden of proof on the skeptics to debunk them or else cough up substantial funds to supposedly combat these alleged threats.