Dagon comments on Suffering - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (92)
Most people do this intuitively, and most people tend to rationalise their intuitive judgements or construct neat logical moral theories to support them (and these theories usually fail to describe what they are intended to describe, because they are too simple relative to the complexity of an average person's value system).
That said, for me an agent is the more morally significant the more similar it is to a human, and I determine suffering by comparison with my own experiences plus some necessary extrapolation. Perhaps not a very useful answer, but I don't know of a better one.
the more morally significant the more similar it is to a human
I'd expand this to "the more I empathize with it". Often, I feel more strongly about the suffering of some felines than some humans.
Of course, that's just a description, not a recommendation. The question of "what entities should one empathize with" remains difficult. Most answers that are self-consistent and match observed behavior diverge considerably from the signaling (including self-signaling) that you'd like to give out.
Of course it's a description. I understood the original post as asking for a description as much as a recommendation.
The question "what entities should one empathize with" is as difficult as many similar questions about morality, since it's not entirely clear what "should" means here. If your values form a system from which the answer can be derived, derive it; but one can hardly expect wide consensus. My recommendation is: you don't need the answer; use your own intuition instead. I think the chances that our intuitions overlap significantly are higher than the chances of discovering an answer satisfactory to all.