
ChristianKl comments on The Great Filter is early, or AI is hard - Less Wrong Discussion

Post author: Stuart_Armstrong, 29 August 2014 04:17PM



Comment author: ChristianKl 29 August 2014 09:21:07PM 1 point

Alternatively, the only stable AGI has a morality that doesn't lead it to simply colonise the whole universe.

Comment author: Stuart_Armstrong 01 September 2014 06:02:41PM 1 point

Not colonising the universe - many moralities could be compatible with that.

Allowing potential rivals to colonise the universe... that's much rarer.