TimFreeman comments on SIAI - An Examination - Less Wrong
What do you mean by existential risk, then? I thought things that killed all humans were, by definition, existential risks.
What, if anything, do you value that you expect to exist in the long term?
Pretty compelling arguments, IMO. It's simple -- the vast majority of goals are easier to achieve with more resources, and humans control resources, so an entity that is able to self-improve will tend to seize control of resources, taking them out of human hands.
Do you have a counterargument, or something relevant to the issue that isn't just an argument?