Xyrik comments on Estimate the Cost of Immortality - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
That's a very good point, and one I hadn't thought of — it's basically why I made the post. Though I did mention somewhere that a scenario like this would only work if we had some AGI that could reliably judge who needed what resources, and when, in order to further the overall human endeavor.
Wouldn't the AGI also need the ability to compel obedience to its diktats? Or do you imagine that everyone will do whatever it tells them to do because it must be the best thing to do?