This just puts me off being utilitarian to be honest.
Understandably so, because the outside view says that most such sacrifices for the greater good end up having been the result of bad epistemology and unrealistic assessments of the costs and benefits.
Strong rationality means that you'd be able to get away with such an act. But strong rationality also means that you generally have better methods of achieving your goals than dubious plans involving sacrifice. When you end up thinking you have to do something intuitively morally objectionable 'for the greater good', you should have tons of alarm bells going off in your head screaming 'have you really paid attention to the outside view here?!'.
In philosophical problems, you might still have a dilemma. But in real life, such tradeoffs just don't come up on an individual level where you have to actually do the deed. Some stock traders might be actively profiting by screwing everyone over, but they don't have to do anything that would feel wrong in the EEA. The kinds of objections you hear against consequentialism are always about actions that feel wrong. Why not a more realistic example that doesn't directly feed off likely misplaced intuitions?
Imagine you're a big-time banker whose firm is making tons of money off of questionably legal mortgage loans that you know will blow up in the economy's face, but you're donating all your money to a prestigious cancer research institute. You've done a very thorough analysis of the relevant literature and talked to many high-status doctors, and they say that with a couple billion dollars a cure for cancer is in sight. You know that when the economy blows up it will leave lots of jobless folk without the ability to remortgage their homes. Which is sad, and you can picture all those homeless middle class people and their kids, depressed and alone, all because of you. But cancer is a huge, ugly cause of death, and you can also picture all of those people who wouldn't have to go through painful treatments only to die painfully anyway. Do you do the typically immoral and questionably illegal thing for the greater good?
Why isn't the above dilemma nearly as forceful an argument against consequentialism? Is it because it doesn't appeal in the same way to your evolutionarily adapted sense of justice? Then that might be evidence that your evolutionarily adapted sense of justice wasn't meant for rational moral judgement.
Which is sad, and you can picture all those homeless middle class people and their kids, depressed and alone, all because of you.
And some would go on to commit suicide from losing status, be unable to afford health insurance, or die from lack of heating... Sure, not all of them, but some of them would. Also, cancer patients who relied on savings to pay for care might be affected in the time lag between the crash and the cure being developed.
I'd also have to weigh how quickly the billions could be raised without tanking the economy, and how many people would be saved by the difference in when the cure was developed.
So I am still stuck doing moral calculus, with death on my hands whichever I choose.
Related: Taking ideas seriously
Let us say hypothetically you care about stopping people smoking.
You were going to donate $1000 to GiveWell to save a life; instead you learn about an anti-tobacco campaign that is better. So you choose to donate the $1000 to a campaign to stop people smoking rather than to a GiveWell charity to save an African's life. You justify this by expecting more people to live due to having stopped smoking (this probably isn't true, but grant it for the sake of argument).
The consequence of donating to the anti-smoking campaign is that 1 person dies in Africa and 20 people all over the world who would otherwise have died instead live.
Now you also have the choice of setting fire to many tobacco plantations. You estimate that the increased cost of cigarettes would save 20 lives, but the fire will likely kill 1 guard. You are very intelligent, so you think you can get away with it; there are no further consequences to this action. You don't care much about the scorched earth or the loss of profits.
If there are causes with payoff matrices like this, then it seems like a real world instance of the trolley problem. We are willing to cause loss of life due to inaction to achieve our goals but not cause loss of life due to action.
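If it helps to see the comparison laid out, here is a minimal sketch of the payoff numbers above as naive utilitarian arithmetic. The figures are the hypothetical ones from this comment, not real-world estimates:

```python
# Naive lives-saved accounting for the three options in the comment.
# All numbers are the hypothetical ones stated above, not real estimates.

def net_lives_saved(saved, killed):
    """Net change in lives for an option: lives saved minus lives lost."""
    return saved - killed

# Option A: donate $1000 to a GiveWell charity -> 1 life saved, 0 lost.
donate_givewell = net_lives_saved(saved=1, killed=0)

# Option B: donate the $1000 to the anti-smoking campaign instead ->
# 20 smokers live, 1 person in Africa dies who would otherwise be saved.
donate_antismoking = net_lives_saved(saved=20, killed=1)

# Option C: burn the plantations -> 20 lives saved, 1 guard killed.
burn_plantations = net_lives_saved(saved=20, killed=1)

print(donate_givewell, donate_antismoking, burn_plantations)  # prints: 1 19 19
```

Under a pure lives-saved count, the donation switch and the arson come out identical, which is exactly what makes the action/inaction distinction do all the work in the intuition.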
What should you do?
Killing someone is generally wrong, but you are causing the death of someone in both cases. You either need to justify that leaving someone to die is ethically not the same as killing someone, or inure yourself to the fact that when you choose to spend $1000 in a way that doesn't save a life, you are killing. Or ignore the whole thing.
This just puts me off being utilitarian to be honest.
Edit: To clarify, I am an easy-going person, and I don't like making life and death decisions. I would rather live and laugh, without worrying about things too much.
This confluence of ideas made me realise that we are making life and death decisions every time we spend $1000. I'm not sure where I will go from here.