Rain comments on Thoughts on the Singularity Institute (SI) - Less Wrong

Post author: HoldenKarnofsky 11 May 2012 04:31AM

Comment author: taw 10 May 2012 06:04:43PM 3 points

Existential risk reduction is a very worthy cause. As far as I can tell, there are only a few serious efforts: they address scenarios which, by an outside view, have non-negligible chances, and in many of these scenarios the efforts make a non-negligible difference to the outcome.

Such efforts are:

  • asteroid tracking
  • seed vaults
  • development of various ways to deal with potential pandemics (early tracking systems, drugs, etc.) - this actually overlaps a lot with "normal" medicine
  • arguably, global warming prevention is a borderline case, since there is a tiny chance of massive positive feedback loops that would make Earth nearly uninhabitable. Modern climate science considers these chances tiny, but all existential-risk probabilities are tiny.

That's about the entire list I'm aware of. (Are there any others?)

And then there's a huge number of efforts which claim to address existential risk, but where either the theory behind the risk they concern themselves with, or the theory behind why their efforts are likely to help, rests on assumptions not shared by the vast majority of competent people.

All FAI-related efforts suffer from both of these problems: the risk they posit is not based on any established science, and their proposed answer is even less grounded in reality. If they suffered from only one of these problems, this might be fixable, but as far as I can tell they are extremely unlikely ever to join the category of serious efforts.

The best claim these non-serious efforts can make is that (tiny chance the risk is real) * (tiny chance the organization will make a difference) * (huge stakes) is still a big number, but that's not a terribly convincing argument.
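For illustration, here is a minimal sketch of that expected-value arithmetic in Python; every number below is a made-up placeholder chosen for demonstration, not an estimate of anything:

    # Made-up illustrative numbers; not estimates.
    p_risk_real = 1e-6      # tiny chance the posited risk is real
    p_effort_helps = 1e-6   # tiny chance the organization makes a difference
    value_at_stake = 1e12   # huge assumed stakes if the risk is averted

    # The whole argument rests on this product:
    expected_value = p_risk_real * p_effort_helps * value_at_stake
    print(expected_value)   # 1.0 with these placeholders

Note that the product is "still a big number" only if the assumed stakes are large enough to swamp both tiny probabilities, which is exactly why the argument is unconvincing: the conclusion is driven by the least-constrained input.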

I'm under the impression that we're doing far less than we could with these serious efforts, and that we haven't really identified everything that can be addressed by such efforts. We should focus there (and on a lot of things which are not related to existential risk).

Comment author: Rain 10 May 2012 09:10:51PM 3 points
Comment author: taw 10 May 2012 10:16:44PM 2 points

Most of the entries on the list are not quantifiable even approximately, to within an order of magnitude. Of those that are (which is pretty much only "risks from nature" in Bostrom's system), many are still bad candidates for putting significant effort into, because:

  • we have few ways to deal with them (like nearby supernova explosions)
  • we have a lot of time, and the future will be better equipped to deal with them (like the eventual demise of the Sun)
  • they don't actually seem to get anywhere near civilization-threatening levels (like volcanoes)

About the only new risk I see on the list which can and should be dealt with is massive solar flares, where we should have some backup plans; but I'm not sure what we can do about it other than putting some extra money into astrophysics departments so they can understand the phenomenon better and give us better estimates.