ChristianKl comments on Open thread, Jan. 18 - Jan. 24, 2016 - Less Wrong Discussion

Post author: MrMind, 18 January 2016 09:42AM

Comment author: turchin, 21 January 2016 09:34:20PM, 0 points

My criticism concentrates on two levels: his wording, and his model of x-risks and their prevention. His wording is ambiguous when he speaks about tens of thousands of years - we don't have that long.

But I also think that his claims that we have 100 years (with a small probability of extinction) and that space colonies are our best chance are both false.

Firstly, because we need strong AI and nanotech to create a truly self-sustaining colony. Self-replicating robots are the best way to build colonies, so we need to address the risks of AI and nanotech before we create such colonies. And I think that strong AI will be created in less than 100 years. The same may be said about most other risks - we could create a new flu virus even now, without any new technologies. A global catastrophe is almost certain in the next 100 years if we don't implement protective measures here on Earth.

Space colonies will not be safe from UFAI or from nanobots. Large spacecraft may be used as kinetic weapons against planets, so space exploration could create new risks. Space colonies will also not be safe from internal conflicts, as a large colony will be able to create nukes and viruses and use them against another planet, against another colony on the same planet, or even in acts of internal terrorism. Only starships travelling at near light speed may be useful as an escape mechanism, as they could help spread civilization through the Galaxy and create many independent nodes.

Our best options for preventing x-risks are international control systems for dangerous tech and, later, friendly AI; we need to work on these now, whereas space colonies have only remote and marginal utility.

Comment author: ChristianKl, 22 January 2016 10:03:28AM, 0 points

But I also think that his claims that we have 100 years (with a small probability of extinction)

His claim is that we have 100 years in which we have to be extra careful to prevent x-risk.

The same may be said about most other risks - we could create a new flu virus even now, without any new technologies.

With today's technology you could create a problematic new virus. On the other hand, that would hardly mean extinction. Wearing masks 24/7 to filter the air isn't fun, but it's a possible step if we are afraid of airborne viruses.

Our best options for preventing x-risks are international control systems for dangerous tech and, later, friendly AI; we need to work on these now, whereas space colonies have only remote and marginal utility.

It's not like Hawking doesn't call for AGI control.