RobbBB comments on David Chalmers' "The Singularity: A Philosophical Analysis" - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
No, I don't think so. The possibility of p-zombies is very important for FAI, because if zombies are possible, it seems likely that an FAI could never tell sentient beings apart from non-sentient ones. Suppose our values all center on promoting positive experiential states for sentient beings, and we are indifferent to the 'welfare' of insentient ones. Then a failure to resolve the Hard Problem places a serious constraint on our ability to create a being that can accurately identify the things we value in practice, and on our own ability to determine which AIs or 'uploaded minds' are loci of value (i.e., are sentient).