Perplexed comments on Link: The Uncertain Future - "The Future According to You" - Less Wrong Discussion

6 Post author: XiXiDu 13 January 2011 05:44PM

Comment author: Perplexed 14 January 2011 04:47:17PM 1 point

...which almost suggests to me that maybe the optimal political policy to advocate is for things that reduce the likelihood and scope of prosaic disasters ...

Am I reading my results wrong?

No, I think you are reading them right. If your projection places the AI singularity more than a few decades out (given business-as-usual), then some other serious civilization-collapsing disaster is likely to arise before the FAI arrives to save us.

But, the scenario that most frightens me is that the near-prospect of an FAI might itself be the trigger for a collapse - due to something like the 'Artilect War' of de Garis.

Comment author: timtyler 14 January 2011 09:10:12PM 1 point

De Garis has luddites on one side of his war. That group has historically been impoverished and has lacked power. The government may well just declare them undesirable terrorists - and stomp on them - if they start causing trouble.

Consider the environmental movement today. Its members are usually fairly peace-loving. It doesn't seem terribly likely to me that their descendants will go into battle.