William_S comments on Superintelligence 19: Post-transition formation of a singleton - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
If you had copies, how altruistic do you think you would be toward them?
I would like to think that I would cooperate reasonably with my copies, especially when there is a strong reason to prioritize global values over selfish values.
However, in practice I would expect System 1 to still see copies as separate but related individuals rather than as myself, and this would limit how much cooperation actually occurs. I might have to engage in some self-deceptive reasoning to justify selfishness, but the human brain is good at that ("I've been working harder than my copies - I deserve a little extra!").