3^^^3 dust specks in everybody's eye?
So basically we're talking about turning all sentient life into black holes, or torturing everybody?
I mean, it depends on how bad the torture we're talking about is, and how long it will last. If it's permanent and unchanging, people will eventually get used to it/evolve past it and move on. If it's short-term, people will eventually get past it. So in either of those cases, torture is the obvious choice.
If, on the other hand, it's permanent and adaptive such that all life is completely and totally miserable for perpetuity, and there is nothing remotely good about living, oblivion seems the obvious choice.
This seems like a weird mishmash of other hypotheticals on the site, I'm not really seeing the point of parts of your scenario.
Well, I personally don't want to be tortured, so I choose the dust speck.
Even if I weren't personally involved, and were deciding on morality alone rather than personal interest, average utilitarianism tells me that I should choose the dust speck. (Better that 100% of all people suffer a dust speck than that 100% of all people suffer torture.)
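To spell out that average-utilitarian arithmetic (a minimal sketch with made-up disutility numbers; the specific values are my assumptions, not part of the scenario): when everyone is affected identically, population size cancels out, so only the per-person harm matters.

```python
# Hypothetical per-person disutility values (assumed for illustration only).
SPECK_DISUTILITY = 1e-9     # harm of one dust speck
TORTURE_DISUTILITY = 1e6    # harm of being tortured

def average_disutility(per_person_harm, population):
    # Everyone suffers the same harm, so the average equals that harm
    # and the population size cancels out entirely.
    return per_person_harm * population / population

# Whether there's one of me or 3^^^3 of me, the comparison is the same.
for population in (1, 10**100):
    assert average_disutility(SPECK_DISUTILITY, population) < \
           average_disutility(TORTURE_DISUTILITY, population)
```

That's why the average utilitarian picks specks regardless of how many copies exist.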
This doesn't seem very coherent.
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
OK. Then that means if I choose torture, I am alone. If I choose the dust specks, I am not alone. I don't want to be tortured, and don't really care about 3^^^3 people getting dust specks in their eyes, even if they're all 'perfect copies of me'. I am not a perfect utilitarian.
A perfect utilitarian would choose torture, though, because one person getting tortured is technically not as bad, from a utilitarian point of view, as 3^^^3 dust specks in eyes.
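To make that total-utilitarian comparison concrete (a minimal sketch; the 10^-100 exchange rate between a speck and the torture is a made-up assumption, and since 3^^^3 can't be stored, it's replaced by the vastly smaller lower bound 3^^4 = 3**(3**27), compared in log10):

```python
import math

# Assumed exchange rate: one speck is 10**-100 as bad as the torture.
SPECK_LOG10 = -100   # log10 of speck disutility, in torture-units
TORTURE_LOG10 = 0    # one torture, normalized to 1

# log10(3^^4) = log10(3**(3**27)) = 3**27 * log10(3), roughly 3.6e12.
# This is a drastic underestimate of 3^^^3.
log10_people_lower_bound = 3**27 * math.log10(3)

# Even with the absurdly unfavorable exchange rate, total speck disutility
# exceeds the torture by a factor of roughly 10**(3.6e12).
assert log10_people_lower_bound + SPECK_LOG10 > TORTURE_LOG10
```

The sum dominates no matter how tiny the per-speck harm, which is the standard torture-vs-specks argument in miniature.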
The way the problem reads to me, choosing dust specks means I live in a universe where 3^^^3 of me exist, and choosing torture means only 1 of me exists. I prefer that more of myself exist than not, so I should choose specks in this case.
In a choice between "torture for everyone in the universe" and "specks for everyone in the universe", the negative utility of the former obviously outweighs that of the latter, so I should choose specks.
I don't see any incongruity or reason to question my beliefs? I suppose it's meant to be implied that it's ...
For the case that dust specks aren't additive, assuming we treat copies of me as distinct entities with distinct moral weight, 3^^^3 copies of me is either a net negative (3^^^3 lives not worth living) or a net positive (an additional 3^^^3 lives worth living). The point of the dust speck is that it has only a negligible effect; the weight of the dust-speck moral issue is completely subsumed by the weight of the duplicate-people issue.
If we don't treat them as distinct moral entities, well, the duplication and the dust spec...
It makes a huge difference whether the dust speck choices add up or not. If they do, OrphanWilde's objection applies and the only path to survival is to be tortured.
If they don't, so each one of me gets one dust speck total, then dust specks for sure. All of the copies of me (whether there are one or 3^^^3 of us) are experiencing what amounts to a choice between individually being dust-specked or individually being tortured. We get what we ask for either way, and no one else is actually impacted by the choice.
There's no need to drag average utilitarianism in.
I choose torture if and only if I'm alone. Otherwise the predictor would be wrong, contrary to the assumptions of the hypothetical. But I'd rather be in the world where dust specks get chosen.
IMO since people are patterns (and not instances of patterns), there's still only one person in the universe regardless of how many perfect copies of me there are. So I choose dust specks. Looks like the predictor isn't so perfect. :P
What's stopping you from creating a subagent (informing a friend) with the task of convincing you not to drink AFTER you've gotten the money? ...
Like Odysseus with the Sirens, you'd have to "create a subagent"/hire a friend to convince you not to drink before you intend to drink it, and then actually change your intentions and want to drink it.
This doesn't seem possible for a human mind, though of course it's easier to imagine for artificial minds that can be edited at will.
How is it not possible? When force is allowed, the hired people could simply physically restrain me: I'd fight them tooth and nail, their vastly superior training would have me on the floor within a minute, and I'd be kept separate from the vial of toxin for the remainder of the day. ... Although I guess "separation for a period of time"-based arguments rely on you being both obsessive AND pedantic enough not to care about it the next day. Being really passionate about something and then dropping the issue the next day because th...
You're given the option to torture everyone in the universe, or inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, all copies of you are given the choice, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
What do you do?
How does your answer change if the predictor made the copies of you conditional on their prediction?
How does your answer change if, in addition to that, you're told you are the original?