Daniel_Burfoot comments on Changing accepted public opinion and Skynet - Less Wrong

15 [deleted] 22 May 2009 11:05AM


Comment author: Daniel_Burfoot 24 May 2009 02:20:51PM 0 points

One central problem is that people are constantly deluged with information about incipient crises. The Typical Person cannot be expected to understand the difference in risk levels indicated by UFAI vs. bioterror vs. thermonuclear war vs. global warming, and this is not even a disparagement of the Typical Person: these risks are genuinely hard to estimate.

But how can we deal with this multitude of potential disasters? Each disaster has some low probability of occurring, but because there are so many of them (swine flu, nuclear EMP attacks, grey goo, complexity ...) we are almost certainly doomed unless we do something clever. Even if we take preventative measures sufficient to eliminate the risk of one problem (presumably at enormous expense), we will just get smashed by the next one on the list.
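The compounding effect above can be made concrete. Assuming the risks are independent, the chance that at least one disaster strikes is 1 - (1 - p)^N, which climbs toward certainty as the number of risks N grows even when each individual probability p stays small. A minimal sketch (the numbers are illustrative, not actual risk estimates):

```python
# Hypothetical illustration: many independent low-probability risks
# compound. Even if each risk is individually unlikely, the chance
# that at least one occurs is 1 - (1 - p)^N, which approaches 1 as
# the number of risks N grows.

def p_at_least_one(p: float, n: int) -> float:
    """Probability that at least one of n independent events,
    each with probability p, occurs."""
    return 1 - (1 - p) ** n

# With 20 risks at a 5% chance each, disaster is more likely than not:
print(round(p_at_least_one(0.05, 20), 3))  # → 0.642

# Eliminating one risk entirely barely helps:
print(round(p_at_least_one(0.05, 19), 3))  # → 0.623
```

This is why picking off risks one at a time is a losing game, and why the comment turns to strategies that reduce all the risks at once.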

Meta-strategy: find strategies that help defend against all sources of existential risk simultaneously. Candidates:

  • moon base
  • genetic engineering of humans to be smarter and more disease-resistant
  • generic civilizational upgrades, e.g. reducing traffic and improving the economy
  • simplification. There is no fundamental reason why complexity always has to increase. Almost everything can be simplified: the law, the economy, software.