Jack comments on The Trolley Problem: Dodging moral questions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The purpose of thought experiments and other forms of simulation is to teach us to do better in real life. Obviously, no simulation can be perfectly faithful to real life. But if a given simulation is not merely imperfect but actively misleading, such that training in the simulation will make your real performance worse, then rejecting the simulation is a perfectly rational thing to do.
In real life, if you think the greater good requires you to do evil, you are probably wrong. Therefore, given a thought experiment in which the greater good really does require you to do evil, rejecting the thought experiment, on the grounds that it is worse than useless for training purposes, is a correct answer.
Not at all. That's way too broad a claim and definitely not the case for the trolley problem. The purpose of the trolley problem is to isolate and identify people's moral intuitions.
Well, depending on what you're trying to nail down as "the purpose", that's not true. The purpose of the trolley problem was to serve as an example of the kinds of ridiculous thought experiments conceived of by moral philosophers (via Philippa Foot). But you know, Poe's Law.
I'm sure you've seen this at some point, but for others...
Consider the following case:
Choose the left track, because cancer kills more people than Hitler (assuming the cure would otherwise be delayed by at least 10 years, and that implementing it doesn't cost more than is currently spent on cancer and a few other things).
And what is the purpose of identifying moral intuitions?
Figuring out how to manipulate those intuitions in order to increase sales of Frosted Flakes.
In which case those who neither currently want Frosted Flakes nor want to want them are still best served by not participating.