Mark_Friedenbach comments on In order to greatly reduce X-risk, design self-replicating spacecraft without AGI - Less Wrong

Post author: chaosmage 20 September 2014 08:25PM




Comment author: [deleted] 21 September 2014 08:20:22AM 2 points

Building a self-replicating lunar mining and factory complex is one thing. Building a self-replicating machine that can operate effectively in any situation it encounters while expanding into the cosmos is another story entirely. Since its environment can't be known in advance, it would have to adapt to whatever circumstances it finds itself in to achieve its replication goal. That's essentially the definition of an AGI.

Comment author: Eniac 07 December 2014 04:39:27AM 1 point

Bacteria perform quite well at expanding into an environment, and they are not intelligent.

Comment author: [deleted] 10 December 2014 03:48:06AM 1 point

I would argue they are, for some level of micro-intelligence, but that's entirely beside the point. A bacterium doesn't know how to create tools, self-modify, or purposefully engineer its environment to make it more survivable.