byrnema comments on Advice for AI makers - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I was considering the least convenient argument: the one I imagined would result in the least aggressive AI. (I should explain that even a terminal utility of 0 for a resource would not give that resource 0 utility overall, because the resource would still have some instrumental value for achieving things that do have value.)
(Above edited because I don't think I was understood.)
But I think the logical problem identified above, that of assigning a utility to something whose value is merely instrumental, remains either way.
You pretty much have to guess at the marginal value of resources. But suppose the AI's utility function is "the 10^10th root of the number of paperclips in the universe." Then it probably satisfies the criterion.
EDIT: even better would be U = 1 if the universe contains at least one paperclip, otherwise 0.
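To make the intuition concrete, here is a minimal sketch of the two toy utility functions above (function names are my own, purely illustrative). The point is that the root utility's marginal gain per additional paperclip shrinks toward zero almost immediately, and the threshold utility's marginal gain is exactly zero once a single paperclip exists:

```python
def root_utility(n, k=10**10):
    """U(n) = n^(1/k): total utility still grows with n,
    but the marginal value of one more paperclip is astronomically small."""
    return n ** (1.0 / k)

def threshold_utility(n):
    """U = 1 if the universe contains at least one paperclip, otherwise 0."""
    return 1 if n >= 1 else 0

# Marginal value of the 2nd and 3rd paperclip under the root utility:
m1 = root_utility(2) - root_utility(1)
m2 = root_utility(3) - root_utility(2)
print(m1, m2)  # both tiny, and decreasing
```

Either function gives the AI almost nothing to gain from converting extra resources into paperclips, which is what the "least aggressive" criterion seems to require.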