I've seen Newcomb and Dust specks vs Torture but not Trolley (although I've seen that one before in other places). Which sequences do I need to finish for those?
If the trolley one is the same as the "standard" version, then it's fairly trivial within the framework of Orthodox Judaism (if I'm allowed to bring that in), because of strict rules about death. I'll elaborate further when I'm up to the question. The other two are a lot more complicated for me.
Two replies:
1) Even if hedonistic utilitarianism would ultimately be wrong as a full description of what a person values, "maximize pleasure while minimizing suffering" can still be a useful heuristic to follow. Yes, following that heuristic to its logical conclusion would mean forcibly rewiring everyone's brains, but that doesn't need to be a problem for as long as forcibly rewiring people's brains isn't a realistic option. HU may still be the best approximation of a person's values in the context of today's world, even if it isn't the best description overall.
2) The arguments on complexity of value and so on establish that the average person's values aren't correctly described by HU. This still leaves open the possibility of someone only approving of those of their behaviors that serve to promote HU, so there may well be individual people who accept HU, due to not sharing the moral intuitions which motivate the objections to it.
On 1): I am skeptical of replies to the effect that "yes, well, X might not be quite right, but it's a useful heuristic, therefore I will go on acting as if X is right". For one thing, a person who makes such a reply usually goes right back to saying "X is right!" (sans qualifiers) as soon as the current conversation ends. Let's get clear on what we actually believe, I generally think; once we've firmly established that, we can look for maximally effective implementations.
For another thing, HU may be the best approximation etc. etc., but that's a claim that at least should be made explicitly, such that it can be examined and argued for; a claim of this importance shouldn't come up only in such tangential discussion branches.
For a third thing, what happens when forcibly rewiring people's brains becomes a realistic option?
On 2): I think there are two issues here. There could indeed be people who accept HU because that's what correctly describes their moral intuitions. (Though I should certainly hope they do not think it proper to impose that moral philosophy on me, or on anyone else who doesn't subscribe to HU!)
"Only approving of those behaviors that serve to promote HU" is, I think, a separate thing. Or at least, I'd need to see the concept expanded a bit more before I could judge. What does this hypothetical person believe? What moral intuitions do they have? What exactly does it mean to "promote" hedonistic utilitarianism?