
Ben_LandauTaylor comments on The Argument From Marginal Cases - Less Wrong Discussion

15 Post author: jkaufman 26 July 2013 01:30PM


Comment author: Ben_LandauTaylor 26 July 2013 07:21:07PM 4 points

In general, I think trying to weigh secular values against sacred values is a recipe for reducing the amount you care about the former.

If I understand the sacred/secular terminology correctly, then this seems like a feature, not a bug.

Comment author: Mestroyer 27 July 2013 03:02:55AM 4 points

It could be a feature if the secular value is a big house or something, and the sacred value is whatever you might donate to an effective charity for.

It's definitely not a feature when the value is sacred to the individual only because imagining a compromise conjures scorn from their peers, or because society has handed down that kind of sacred value since antiquity.

Also, not all values that people treat as lexically lower than their most sacred values (in reality there are probably more than two tiers, of course) are things you would want to get rid of. Most of fun theory probably sits far below human life on the hierarchy of things that cannot be traded away, and yet you still want concern for it to play a significant role in shaping the future.

And then there are taboo tradeoffs between a certain amount of a thing and a smaller amount of the same thing; following the kind of thought process I warned against leads you into the territory of clear madness, like choosing specks over torture no matter the number of specks.

A more troubling counterargument to what I said is that no matter what you do, you are living an answer to the question, so you can't just ignore it. This is true if you are an effective altruist (one who has already decided against working on existential risk) trying to decide whether to focus on helping humans or helping animals. Then you really do need a utilitarian calculus that requires that number.
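The calculus in question is just a cost-effectiveness comparison once you have the number. A minimal sketch, with entirely hypothetical dollar figures and a made-up `moral_weight` parameter standing in for "that number" (the value of one animal-year relative to one human QALY):

```python
def better_cause(cost_per_human_qaly, cost_per_animal_year, moral_weight):
    """Compare welfare purchased per dollar by two interventions.

    moral_weight is the (hypothetical) value of one animal-year
    relative to one human QALY; the figures here are illustrative,
    not real charity cost-effectiveness estimates.
    """
    human_value_per_dollar = 1.0 / cost_per_human_qaly
    animal_value_per_dollar = moral_weight / cost_per_animal_year
    return "humans" if human_value_per_dollar >= animal_value_per_dollar else "animals"

# With $50 per human QALY and $1 per animal-year, the answer flips
# entirely on the moral weight you plug in:
print(better_cause(50, 1, moral_weight=0.001))  # prints "humans"
print(better_cause(50, 1, moral_weight=0.1))    # prints "animals"
```

The point of the sketch is only that the conclusion is extremely sensitive to the weight, which is why the comment treats obtaining that number as the hard part.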

If I needed that number, I would first try to spend some time around the relevant sort of animal (or the closest approximation I could get) and gather as much information as possible about what it was cognitively capable of, and hang out with some animal-loving hippies to counteract social pressure in favor of valuing humans infinitely more than animals. Then I might try to figure things out indirectly, through separate comparisons to a third variable. Perhaps my own time? I don't think I feel any hard taboo tradeoffs at work when I think about how much time I'd spend to help animals versus how much I'd spend to help humans, though maybe I've just worn out the taboo by thinking as much as I have about trading my time for lives. (Edit: that sounded a lot more evil than is accurate. To clarify, killing people to save time does sound horrifying to me, but not bothering to save distant strangers doesn't.)