ChristianKl comments on [Link] Values Spreading is Often More Important than Extinction Risk - Less Wrong

11 Post author: Pablo_Stafforini 07 April 2013 05:14AM




Comment author: ChristianKl 10 April 2013 03:17:27PM -1 points

> Maybe it could ally with some people and give them tech/power in exchange for carrying out its bidding.

Some AIs already do this today. They outsource work they can't do to Amazon's Mechanical Turk, where humans get paid to do tasks for the AI.

Other humans take on jobs on Rentacoder where they never see who's hiring them.

> Even with what you describe, humans wouldn't become extinct, barring other outcomes like really bad nuclear war or whatever.

Humans wouldn't go extinct in a short time frame, but if the AGI has decades, it can increase its own power over time and decrease its dependence on humans. Sooner or later humans wouldn't be useful to the AGI anymore, and then they would go extinct.