jimrandomh comments on Metacontrarian Metaethics - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (75)
Actually, considering the possibility that you've misjudged the probability doesn't help with Pascal's Mugging scenarios, because the expected utility decomposes as E(U) = P(judgment was correct)·E(U | judgment was correct) + P(judgment was incorrect)·E(U | judgment was incorrect), and the first term alone is enough to carry the mugging.
And while P(judgment was correct) may be small, it won't be astronomically small under ordinary circumstances, which is what it would take to resolve the mugging.
(My preferred resolution is to restrict the class of admissible utility function-predictor pairs to those where probability shrinks faster than utility grows for any parameterizable statement, which is slightly less restrictive than requiring bounded utility functions.)
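As a toy illustration of that admissibility condition (the priors and utility functions below are hypothetical stand-ins, not part of the original proposal): for a claim parameterized by N, the pair is admissible when the contribution P(N)·U(N) vanishes as N grows.

```python
import math

# Toy sketch of the proposed admissibility condition (illustrative only):
# a predictor/utility pair is admissible if the expected-utility
# contribution P(N) * U(N) of a claim parameterized by N vanishes
# as N grows.

def contribution(prior, utility, n):
    return prior(n) * utility(n)

# Admissible pair: prior shrinks like 1/N^2 while utility grows like N,
# so the product behaves like 1/N and goes to 0.
admissible = [contribution(lambda n: n ** -2, lambda n: n, n)
              for n in (10, 10 ** 3, 10 ** 6)]

# Inadmissible pair: a prior that shrinks only like 1/N (e.g. a crude
# magnitude penalty of 2^-log2(N)) is exactly cancelled by utility N,
# so the contribution never vanishes -- this is the mugging-vulnerable case.
inadmissible = [contribution(lambda n: 2.0 ** -math.log2(n), lambda n: n, n)
                for n in (10, 10 ** 3, 10 ** 6)]

print(admissible)    # shrinks toward 0
print(inadmissible)  # stays at 1.0 no matter how large N gets
```

The bounded-utility requirement mentioned above is strictly stronger: it forces every pair to be admissible by capping U(N) outright, whereas this condition allows unbounded utilities so long as the prior outpaces them.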
BTW, you realize we're talking about torture vs. dust specks and not Pascal's mugging here?
I think he's just pointing out that all you have to do is change the scenario slightly and then my objection doesn't work.
Still, I'm a little curious about how someone's ability to state a large number succinctly makes a difference. I mean, suppose the biggest number the mugger knew how to say was 12, and they didn't know about multiplication, exponents, up arrow notation, etc. They just chose 12 because it was the biggest number they could think of or knew how to express (whether they were bluffing totally or were actually going to torture 3^^^3 people). Should I take a mugger more seriously just because they know how to communicate big numbers to me?
The point of stating the large number succinctly is that it overwhelms the small likelihood of the mugger's story being true, at least if you have something resembling a Solomonoff prior. Note also that the mugger isn't really necessary for the scenario; he's merely there to supply a hypothesis that you could have come up with on your own.
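The description-length intuition behind this can be sketched with a crude stand-in for a Solomonoff-style prior (the 8-bits-per-character penalty below is an illustrative assumption, not the real Kolmogorov complexity):

```python
# Sketch of the description-length point: a complexity-based prior
# penalizes a claim by roughly 2^-(bits needed to state it), which
# depends on how succinctly a number can be *named*, not on its size.

def crude_prior_penalty(claim: str) -> float:
    # 8 bits per character: a toy stand-in for Kolmogorov complexity.
    return 2.0 ** (-8 * len(claim))

# "3^^^3" names an unimaginably large number in five characters,
# so its prior penalty is only 2^-40...
p_knuth = crude_prior_penalty("3^^^3")

# ...while a mugger whose biggest expressible number is "12" can never
# name a stake large enough to overwhelm any sensible penalty.
p_twelve = crude_prior_penalty("12")

print(p_knuth)   # 2^-40, nowhere near 1/3^^^3
print(p_twelve)  # 2^-16
```

This is why the mugger's vocabulary matters: the prior penalty scales with the length of the claim, but the claimed utility scales with the number's value, and 3^^^3 dwarfs 2^40 by an incomprehensible margin.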
Good point. I guess the only way to counter these odd scenarios is to point out that everyone's utility function is different, and then the question is simply whether the responder wants to self-modify (or would be happier in the long run doing so) even after hearing some rationalist arguments to clarify their intuitions. The question of self-modification is a little hard to grasp, but at least it avoids all these far-fetched situations.
For the Pascal's mugging problem, I don't think that will help.
Isn't Pascal's mugging just this?
I'd just walk away. Why should I care? If I thought about it for so long that I had some lingering qualms, and I got mugged like that a lot, I'd self-modify just to enjoy the rest of my life more.
As an aside, I don't think people really care that much about other people dying unless they have some way to connect to it. Someone probably was murdered while you were reading this comment. Is it going to keep you up? On the other hand, people can cry all night about a video game character dying. It's all subjective.
There's a difference between mental distress and action-motivating desire. If I were asked to pay $5 to prevent someone from being murdered with near-certainty, I would. On the other hand, I would not pay $5 more for a video game where a character does not die, though I can't be sure of this self-simulation because I play video games rather infrequently. If I only had $5, I would definitely spend it on the former option.
I do not allow my mental distress to respond to the same things that motivate my actions; intuitively grasping the magnitude of existential risks is impossible and even thinking about a fraction of that tragedy could prevent action, such as by causing depression. However, existential risks still motivate my decisions.
I thought of a way that I could be mugged Pascal-style: If I had to watch even one simulated person being tortured in gratuitous, holodeck-level realistic detail even for a minute if I didn't pay $5, I'd pay. I also wouldn't self-modify to make me not care about seeing simulated humans tortured in such a way, because I'm afraid that would make my interactions with people I know and care about very strange. I wouldn't want to be callous about witnessing people tortured, because I think it would take away part of my enjoyment of life. (And there are ways to amp up such scenarios to make it far worse, like if I were forced to torture to death 100 simulations of the people I most care about in a holodeck in order to save those actual people...that would probably have very bad consequences for me, and self-modifying so that I wouldn't care would just make it worse.)
But let's face it, the vast majority of people are indeed pretty callous about the actual deaths happening today, all the pain experienced by livestock as they're slaughtered, and all the pain felt by chronic pain sufferers. People decry such things loudly, but few of those who aren't directly connected to the victims are losing sleep over such suffering, even though there are actions they could conceivably take to mitigate it. It is uncomfortable to acknowledge, but it seems undeniable.
It's not Pascal's mugging unless it works with ridiculously low probabilities. Would you pay $5 to avoid a 10^-30 chance of watching 3^^^3 people being tortured?
Are you including yourself in "the vast majority of people"? Are you including most of LW? If your utility is bounded, you are probably not vulnerable to Pascal's mugging. If your utility is not bounded, it is irrelevant whether other people act like their utilities are bounded. Note that even egoists can have unbounded utility functions.
It's still way too restrictive though, no? And are there ways you can Dutch book it with deals where probability grows faster (instead of the intuitively very common scenario where probability and utility always scale at the same rate)?