Rain comments on Efficient philanthropy: local vs. global approaches - Less Wrong

Post author: multifoliaterose 16 June 2011 04:21AM




Comment author: Rain 16 June 2011 05:20:03PM  8 points

For now, the question I'm left with is - what OTHER existential risks are out there, how cost effective are they to fix, and do we have an adequate metric to judge our success?

The edited volume Global Catastrophic Risks addresses this question. It is far more extensive than Nick Bostrom's initial Existential Risks paper, and each chapter closes with a list of further reading.

Here are some of the covered risks:

  • Astrophysical processes such as the stellar lifecycle
  • Human evolution
  • Super-volcanism
  • Comets and asteroids
  • Supernovae, gamma-ray bursts, solar flares, and cosmic rays
  • Climate change
  • Plagues and pandemics
  • Artificial Intelligence
  • Physics disasters
  • Social collapse
  • Nuclear war
  • Biotechnology
  • Nanotechnology
  • Totalitarianism

The book also has many chapters discussing the analysis of risk, risk and insurance, prophecies of doom in popular narratives, cognitive biases relating to risk, selection effects, and public policy.

Comment author: prase 16 June 2011 05:50:49PM 1 point

Physics disasters

What's physics disasters?

Comment author: Rain 16 June 2011 05:56:36PM  7 points

What's physics disasters?

Breakdown of the vacuum state, conversion of matter into strangelets, mini black holes, and other outcomes people fear from a particle accelerator like the LHC. It boils down to: "Physics is weird, and we might find some way of killing ourselves by messing with it."

Comment author: amcknight 03 July 2012 01:53:41AM 0 points

What is the risk from Human Evolution? Maybe I should just buy the book...

Comment author: Rain 03 July 2012 05:04:34PM  1 point

It's well-written, though depressing, if you take "only black holes will remain in 10^45 years" to be depressing news.

Evolution is not a forward-looking algorithm, so humans could evolve in dangerous, retrograde ways, losing what we currently consider valuable about ourselves, or even going extinct should the species become too dependent on current conditions.