3^^^3 dust specks in everybody's eye?
So basically we're talking about turning all sentient life into black holes, or torturing everybody?
I mean, it depends on how bad the torture we're talking about is, and how long it will last. If it's permanent but unchanging, people will eventually get used to it, evolve past it, and move on. If it's short-term, people will eventually get past it. So in either of those cases, torture is the obvious choice.
If, on the other hand, it's permanent and adaptive such that all life is completely and totally miserable for perpetuity, and there is nothing remotely good about living, oblivion seems the obvious choice.
This seems like a weird mishmash of other hypotheticals on the site, I'm not really seeing the point of parts of your scenario.
Well I personally don't want to be tortured, so I choose the dust speck.
Even if I weren't personally involved, and were deciding on morality alone rather than on personal interest, average utilitarianism tells me that I should choose the dust speck. (Better that 100% of all people suffer a dust speck than that 100% of all people suffer torture.)
This doesn't seem very coherent.
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
OK. Then that means if I choose torture, I am alone. If I choose the dust specks, I am not alone. I don't want to be tortured, and don't really care about 3^^^3 people getting dust specks in their eyes, even if they're all 'perfect copies of me'. I am not a perfect utilitarian.
A perfect utilitarian would choose torture though, because one person getting tortured is technically not as bad from a utilitarian point of view as 3^^^3 dust specks in eyes.
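To make that aggregation claim concrete, here's a toy calculation. All the numbers are made-up illustrative assumptions (the true relative disutilities are exactly what the specks-vs-torture debate is about), and the variable names are mine:

```python
# Purely illustrative numbers; the real values are precisely what's disputed.
SPECK_DISUTILITY = 1e-12    # assumed harm of one dust speck in one eye
TORTURE_DISUTILITY = 1e9    # assumed harm of the torture

# A total utilitarian sums harms across people, so aggregated specks
# outweigh the torture once the population exceeds this threshold:
threshold = TORTURE_DISUTILITY / SPECK_DISUTILITY
print(f"threshold population: {threshold:.3g}")  # ~1e21 people

# 3^^^3 exceeds this threshold (or any finite threshold you'd get from any
# positive speck disutility) by an unimaginable margin: even 3^^3 = 3**27,
# about 7.6e12, is a microscopic lower bound on it. So on these assumptions
# the total utilitarian picks torture, however small the speck harm is.
```

The conclusion is robust to the particular numbers: any nonzero speck disutility yields a finite threshold, and 3^^^3 dwarfs it.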
The way the problem reads to me, choosing dust specks means I live in a universe where 3^^^3 copies of me exist, and choosing torture means only one of me exists. I prefer that more of me exist than not, so I should choose specks in this case.
In a choice between "torture for everyone in the universe" and "specks for everyone in the universe", the negative utility of the former obviously outweighs that of the latter, so I should choose specks.
I don't see any incongruity or reason to question my beliefs? I suppose it's meant to be implied that it's ...
For the case that dust specks aren't additive: assuming we treat copies of me as distinct entities with distinct moral weight, 3^^^3 copies of me is either a net negative (3^^^3 lives not worth living) or a net positive (an additional 3^^^3 lives worth living). The point of the dust speck is that it has only a negligible effect; the weight of the dust-speck moral issue is completely subsumed by the weight of the duplicate-people issue.
If we don't treat them as distinct moral entities, well, the duplication and the dust spec...
It makes a huge difference whether the dust speck choices add up or not. If they do, OrphanWilde's objection applies and the only path to survival is to be tortured.
If they don't, so each one of me gets one dust speck total, then dust specks for sure. All of the copies of me (whether there are one or 3^^^3 of us) are experiencing what amounts to a choice between individually being dust-specked or individually being tortured. We get what we ask for either way, and no one else is actually impacted by the choice.
There's no need to drag average utilitarianism in.
I choose torture if and only if I'm alone. Otherwise the predictor would be wrong, contrary to the assumptions of the hypothetical. But I'd rather be in the world where dust specks gets chosen.
IMO since people are patterns (and not instances of patterns), there's still only one person in the universe regardless of how many perfect copies of me there are. So I choose dust specks. Looks like the predictor isn't so perfect. :P
All these "creative" solutions are not really allowed. Why is that?
Because the point of these questions isn't to challenge you to find a good answer, it's that the process of answering them may lead to insight into your actual value system, understanding of causation, etc. Finding clever ways around the problem is a bit like cheating in an optician's eye test[1]: sure, maybe you can do that, but the result will be that you get less effective eyesight correction and end up worse off.
[1] e.g., maybe you have found a copy of whatever chart they use and memorized the letters on it.
So, e.g., the point of the toxin puzzle is to ask: can you, really, form an intention to do something when you know that when the time comes you will be able to choose and will have no reason to choose to do it and much reason not to? That's an interesting psychological and/or philosophical question. You can avoid answering it by saying "well, I'd find a way to make taking the toxin not actually do me any harm", and that might be an excellent idea if you ever find yourself in that bizarre situation -- but the point of the question isn't to plan for an actual future where you encounter a quirkily sadistic but generous billionaire, it's to help clarify your thinking about what happens when you form an intention to do something.
Of course you may repurpose the question, and then your "clever" answers may be entirely to the point. Suppose you decide that no, you cannot form an intention to do something that you will have good reason to choose not to do; well, situations might arise where it would be useful to do that (even though the precise situation Kavka describes is unlikely), so it's reasonable to think about how you might make it possible, and then some "clever" answers may become relevant. But others probably won't, and the "get drugged into a coma" solution is probably one of those.
(Incidentally, in the original puzzle the amount of money was a million rather than a billion. That's probably still enough to hire someone to drug you into a coma.)
It is indeed a million, whoops. Thanks for explaining the purpose of such questions in detail. I find that I get into "come up with a clever answer" mode faster if the question involves losses: not getting money is "meh", but a day's worth of excruciating pain in exchange for money? Well, that needs a workaround!
As for the puzzle itself, I don't know if I can form such an intention... but I seem to be really good at it in real life. I call it procrastinating: I make a commitment that fails to account for time discounting, and then I end up going to bed later than I intended. After dinner I intended to go to bed early; at midnight I wanted to watch another episode. So apparently it's possible.
You're given the option to torture everyone in the universe, or inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, all copies of you are chosen, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
What do you do?
How does your answer change if the predictor made the copies of you conditional on their prediction?
How does your answer change if, in addition to that, you're told you are the original?
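For anyone unfamiliar with the 3^^^3 used throughout this thread: it's Knuth's up-arrow notation. A minimal sketch of the definition (the function name `up` is mine, not from the thread):

```python
def up(a, n, b):
    """Knuth's up-arrow a ^...^ b with n arrows; n = 1 is plain exponentiation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # With n arrows: a ^^..^ b = a ^..^ (a ^^..^ (b - 1)), dropping one arrow.
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 1, 3))  # 3^3 = 27
print(up(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# 3^^^3 = up(3, 3, 3) would be a power tower of 3s of height 7625597484987,
# far too large to ever compute -- which is rather the point of the scenario.
```

Python's arbitrary-precision integers handle 3^^3 easily; 3^^^3 is out of reach for any physical computer, so the recursive call for it is shown only to pin down the definition.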