All this AI stuff is an unnecessary distraction. Why not bomb cigarette factories? If you're willing to tell people to stop smoking, you should be willing to kill a tobacco company executive if it will reduce lung cancer by the same amount, right?
This decision algorithm ("kill anyone whom I think needs killing") leads to general anarchy. There are a lot of people around who believe, for one reason or another, that killing various people would make things better, and most of them are wrong - for example, religious fundamentalists who think killing gay people will improve society.
There are three possible equilibria - the one in which everyone kills everyone else, the one in which no one kills anyone else, and the one where everyone comes together to agree on a decision procedure for whom to kill - i.e., establishes an institution with a monopoly on the use of force. This third one is generally better than the other two, which is why we have government and why most of us are usually willing to follow its laws.
I can conceive of extreme cases where it might be worth defecting from the equilibrium because the alternative is even worse - but bombing Intel? Come on. "A guy bombed a chip factory, guess we'll never pursue advanced computer technology again until we have the wisdom to use it."
All this AI stuff is an unnecessary distraction.
In a way yes. It was just the context that I thought of the problem under.
Why not bomb cigarette factories? If you're willing to tell people to stop smoking, you should be willing to kill a tobacco company executive if it will reduce lung cancer by the same amount, right?
Not quite. If you are willing to donate $1000 to an anti-smoking ad campaign because you think the campaign will save more than one life, then yes, it might be equivalent. If killing that executive would have a co...
Related: Taking ideas seriously
Let us say, hypothetically, that you care about stopping people from smoking.
You were going to donate $1000 to GiveWell to save a life, but then you learn about an anti-tobacco campaign that is more effective. So you choose to donate the $1000 to the anti-smoking campaign instead of to a GiveWell charity that would save an African's life. You justify this by expecting more people to live as a result of having stopped smoking (this probably isn't true, but assume it for the sake of argument).
The consequence of donating to the anti-smoking campaign is that one person dies in Africa, and 20 people all over the world who would otherwise have died instead live.
Now you also have the choice of setting fire to a number of tobacco plantations. You estimate that the increased cost of cigarettes would save 20 lives, but the fires will likely kill one guard. You are very intelligent, so you think you can get away with it; there will be no consequences for you. You don't care much about the scorched earth or the loss of profits.
If there are causes with payoff matrices like this, then this seems like a real-world instance of the trolley problem: we are willing to cause loss of life through inaction in order to achieve our goals, but not to cause loss of life through action.
What should you do?
Killing someone is generally wrong, but you are causing a death in both cases. You either need to justify that leaving someone to die is ethically different from killing them, or accept that whenever you choose to spend $1000 in a way that doesn't save a life, you are killing someone. Or ignore the whole thing.
This just puts me off being a utilitarian, to be honest.
Edit: To clarify, I am an easy-going person; I don't like making life-and-death decisions. I would rather live and laugh without worrying about things too much.
This confluence of ideas made me realise that we are making life-and-death decisions every time we spend $1000. I'm not sure where I will go from here.