Stuart_Armstrong comments on Anthropic decision theory I: Sleeping beauty and selflessness - Less Wrong
Comments (24)
Not sure what you mean by parametric invariances; can you elaborate?
I'll rephrase and try to clarify.
What is the preference function of a selfish person supposed to be independent of? What things can change that won't change the value of his preference function?
Concepts of selfishness often seem muddled to me. They seem to imply a concern confined to a millimeter bubble around your body. Well, who is like that? So I'm asking: what do you suppose does and doesn't affect a selfish person's preference function?
In practice, everyone's motivation is a mixture of all sorts of stuff, and very little is even a utility function...
But in theory, this is how I would define a selfish utility: one that is defined entirely in terms of an index "me". If you have two people with exactly the same selfish utility function, completely identical (except that the "me" is different), then those two utilities are independent of each other.
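That independence can be made concrete with a small sketch. Everything below is illustrative and not from the comment itself: I assume a "world" is just a mapping from people to payoffs, and a selfish utility is the same function for everyone, differing only in the index "me" it is evaluated at.

```python
# Hypothetical sketch of a selfish utility defined entirely via an index "me".
# A "world" here is assumed to be a dict mapping person -> payoff.

def selfish_utility(world, me):
    """Utility that depends only on what the world gives to `me`."""
    return world.get(me, 0)

# Two people share the exact same utility function; only the index differs.
world = {"alice": 10, "bob": 3}
u_alice = selfish_utility(world, "alice")
u_bob = selfish_utility(world, "bob")

# Independence: changing what Bob gets leaves Alice's utility unchanged,
# and vice versa.
world_2 = {"alice": 10, "bob": 100}
assert selfish_utility(world_2, "alice") == u_alice
```

On this toy picture, the two utilities are "independent" in the sense that no change confined to Bob's payoff can move Alice's utility, even though both agents run the identical function.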