Many LW readers choose to direct their charitable donations to SingInst with a view toward reducing existential risk. Others do not, whether because they feel they lack an understanding of the relevant issues, because they value present-day humans more than future humans, or because they have concerns as to...
Announcing GiveWell Labs > We’re now launching a new initiative within GiveWell that will not be subject to either of these constraints. We plan to invest about 25% of our research time in what we’re calling GiveWell Labs: an arm of our research process that will be open to any...
Last month I was involved in a conversation thread about what impact a hypothetical nuclear war would have on existential risk. There are many potential nuclear war scenarios, each with a different impact on existential risk. It's difficult to know where to start to gain an understanding of...
From Risk Analysis of Nuclear Deterrence by Martin Hellman. See also http://nuclearrisk.org/ > A full-scale nuclear war is not the only threat to humanity’s continued existence, and we should allocate resources commensurate with the various risks. A large asteroid colliding with the Earth could destroy humanity in the same way...
Points in this article emerged from a conversation with Anna Salamon. I think that thinking about decimal expansions of real numbers provides a good testing ground for one's intuition about probabilities. The context of computation is very different from most of the contexts that humans deal with; in particular it's...
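The idea of using decimal expansions to test probabilistic intuition can be illustrated with a minimal sketch (my own example, not from the post): before computing, a calibrated reasoner assigns roughly 1/10 to each possible value of an unknown digit of sqrt(2), even though the digit is a logically determined fact.

```python
from decimal import Decimal, getcontext

# Set enough precision to read off the first few dozen digits.
getcontext().prec = 50
expansion = str(Decimal(2).sqrt())   # "1.4142135623730950..."
fractional = expansion.split(".")[1]

# Before computing, each value 0-9 gets ~1/10 subjective probability;
# after computing, the digit is simply a known fact.
tenth_digit = int(fractional[9])
print(tenth_digit)
```

The gap between the 1/10 prior and the post-computation certainty is exactly the kind of shift the post uses as a testing ground for intuitions about probability.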
Edit: Carl Shulman made some remarks that have caused me to seriously question the soundness of the final section of this post. More on this at the end of the post. Consider the following two approaches to philanthropy: The “local” approach (associated with "satisficing") is to consider those philanthropic opportunities...
Related to: Confidence levels inside and outside an argument, Making your explicit reasoning trustworthy. A mode of reasoning that sometimes comes up in discussion of existential risk is the following. Person 1: According to model A (e.g. some Fermi calculation with probabilities coming from certain reference classes), pursuing course of...
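The kind of "model A" Fermi calculation mentioned above can be sketched as a chain of multiplied probabilities. The numbers below are entirely hypothetical placeholders, chosen only to show the shape of such a calculation:

```python
# Toy Fermi-style estimate with made-up placeholder factors, purely to
# illustrate what a "model A" calculation looks like in this context.
p_trigger = 1e-2       # hypothetical annual probability of a trigger event
p_escalation = 1e-1    # hypothetical P(escalation | trigger)
p_existential = 1e-2   # hypothetical P(existential outcome | escalation)

# Multiplying conditional probabilities along the chain gives the
# model's annual estimate for the final outcome.
p_annual = p_trigger * p_escalation * p_existential
print(p_annual)
```

The fragility such a model inherits from its weakest input — here, any one of three guessed factors — is what the contrast between "inside" and "outside" confidence levels is about.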