3^^^3 dust specks in everybody's eye?
So basically we're talking about turning all sentient life into black holes, or torturing everybody?
I mean, it depends on how bad the torture we're talking about is, and how long it will last. If it's permanent and unchanging, people will eventually get used to it or evolve past it and move on. If it's short-term, people will eventually get past it. So in either of those cases, torture is the obvious choice.
If, on the other hand, it's permanent and adaptive such that all life is completely and totally miserable for perpetuity, and there is nothing remotely good about living, oblivion seems the obvious choice.
This seems like a weird mishmash of other hypotheticals on the site, I'm not really seeing the point of parts of your scenario.
Well I personally don't want to be tortured, so I choose the dust speck.
Even if I weren't personally involved, and I were deciding on morality alone rather than personal interest, average utilitarianism tells me that I should choose the dust speck. (Better that 100% of people suffer a dust speck than that 100% of people suffer torture.)
This doesn't seem very coherent.
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
OK. Then that means if I choose torture, I am alone. If I choose the dust specks, I am not alone. I don't want to be tortured, and don't really care about 3^^^3 people getting dust specks in their eyes, even if they're all 'perfect copies of me'. I am not a perfect utilitarian.
A perfect utilitarian would choose torture, though, because one person getting tortured is not as bad, from a utilitarian point of view, as 3^^^3 dust specks in eyes.
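For a sense of the scale driving that utilitarian sum, here's a minimal sketch of Knuth's up-arrow notation (the `up` function is my own illustration, not something from the thread). 3^^^3 itself is a power tower of over seven trillion 3s and can't be computed directly, but even the smaller cases show how fast the notation explodes:

```python
def up(a, n, b):
    """Knuth's up-arrow: a followed by n arrows, then b.
    One arrow is ordinary exponentiation; each extra arrow
    iterates the previous operation."""
    if n == 1:
        return a ** b          # a^b: base case, one arrow
    if b == 0:
        return 1               # empty iteration convention
    # a ^^..^ b  =  a ^..^ (a ^^..^ (b-1)), with one fewer arrow outside
    return up(a, n - 1, up(a, n, b - 1))

# 3^3      = 27
# 3^^3     = 3^(3^3) = 3^27 = 7,625,597,484,987
# 3^^^3    = a tower of 3^^3 threes -- far beyond any feasible computation
print(up(3, 1, 3))
print(up(3, 2, 3))
```

So 3^^^3 dust specks, summed naively, swamps any single torture by an unimaginable margin; that's the whole force of the original argument.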
The way the problem reads to me, choosing dust specks means I live in a universe where 3^^^3 of me exist, and choosing torture means 1 of me exist. I prefer that more of myself exist than not, so I should choose specks in this case.
In a choice between "torture for everyone in the universe" and "specks for everyone in the universe", the negative utility of the former obviously outweighs that of the latter, so I should choose specks.
I don't see any incongruity or reason to question my beliefs? I suppose it's meant to be implied that it's ...
For the case that dust specks aren't additive, assuming we treat copies of me as distinct entities with distinct moral weight, 3^^^3 copies of me is either a net negative - as a result of 3^^^3 lives not worth living - or a net positive - as a result of an additional 3^^^3 lives worth living. The point of the dust speck is that it has only a negligible effect; the weight of the dust speck moral issue is completely subsumed by the weight of the duplicate people issue.
If we don't treat them as distinct moral entities, well, the duplication and the dust spec...
It makes a huge difference whether the dust speck choices add up or not. If they do, OrphanWilde's objection applies and the only path to survival is to be tortured.
If they don't, so each one of me gets one dust speck total, then dust specks for sure. All of the copies of me (whether there are one or 3^^^3 of us) are experiencing what amounts to a choice between individually being dust-specked or individually being tortured. We get what we ask for either way, and no one else is actually impacted by the choice.
There's no need to drag average utilitarianism in.
I choose torture if and only if I'm alone. Otherwise the predictor would be wrong, contrary to the assumptions of the hypothetical. But I'd rather be in the world where dust specks gets chosen.
IMO since people are patterns (and not instances of patterns), there's still only one person in the universe regardless of how many perfect copies of me there are. So I choose dust specks. Looks like the predictor isn't so perfect. :P
How is it not possible? When force is allowed, the hired people could simply physically restrain me - I'd fight them tooth and nail, their vastly superior training would have me on the floor within a minute, after which I'd be kept separate from the vial of toxin for the remainder of the day. ... Although I guess "separation for a period of time"-based arguments rely on you being both obsessive AND pedantic enough not to care about it the next day. Being really passionate about something and then dropping the issue the next day because the window of opportunity has closed is ... unlikely to occur, so my solution might end up making me rich but leaving me in the loony bin.
I think a better argument against my idea is logistics - how could I acquire everything I need in a span of (at most) 23 hours? (The wording is such that tonight, as the day turns, you must already intend to take the poison.) A middle-class worker generally doesn't have ties to any mercenaries, and payment isn't given until the morning after your intent has to be formed.
I get your point, though - convincing someone to later convince you already carries massive penalties ("Why are you acting so weird?"), the situation carries massive penalties ("And you believe this guy?", "For HOW MUCH?!")...
My argument basically rests on turning the whole thing into a game: "Design a puzzle you cannot get out of. Then, a few minutes before midnight (to be safe), start doing your utmost best to break this puzzle."
How is it not possible? When force is allowed, the hired people
Why would you hire people to stop you from drinking it, if you intend to drink it, since you know that hiring such people will increase the chances you will end up not drinking it?
I get your point, though - convincing someone to later convince you already carries massive penalties
NO! That's not my point. My point isn't whether it's expensive or difficult to hire someone; my point is that you don't want to hire someone. You intend to drink the toxin, and hiring someone to stop you from doing that doesn't match your intention.
You're given the option to torture everyone in the universe, or to inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, all copies of you are presented with the choice, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
What do you do?
How does your answer change if the predictor made the copies of you conditional on their prediction?
How does your answer change if, in addition to that, you're told you are the original?