You all know the rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
- No more than 5 quotes per person per monthly thread, please.
I like Constant's reply, but it's also worth emphasizing that we can't solve scientific problems by interrogating our moral intuitions. The categories we instinctively sort things into are not perfectly aligned with reality.
Suppose we'd evolved in an environment containing sophisticated 2011-era artificially intelligent Turing-computable robots--ones that could communicate their needs to humans, remember and reward those who cooperated, and attack those who betrayed them. I think it's likely we'd evolve to instinctively think of them as made of different stuff than anything we could possibly build ourselves, because for millions of years that would have been true. We'd evolve to feel moral obligations toward them, up to a point, because doing so would be evolutionarily advantageous, up to a point. Once we developed philosophy, we might take this moral feeling as evidence that they're not Turing-computable--after all, we don't have any moral obligations to a mere mass of tape.