Eugine_Nier comments on Metacontrarian Metaethics - Less Wrong

Post author: Will_Newsome 20 May 2011 05:36AM

Comment author: Eugine_Nier 22 May 2011 05:41:53PM 1 point

BTW, you realize we're talking about torture vs. dust specks and not Pascal's mugging here?

Comment author: Amanojack 22 May 2011 05:49:20PM 0 points

I think he's just pointing out that all you have to do is change the scenario slightly and then my objection doesn't work.

Still, I'm a little curious about how someone's ability to state a large number succinctly makes a difference. I mean, suppose the biggest number the mugger knew how to say was 12, and they didn't know about multiplication, exponents, up-arrow notation, etc. They just chose 12 because it was the biggest number they could think of or knew how to express (whether they were bluffing entirely or actually going to torture 3^^^3 people). Should I take a mugger more seriously just because they know how to communicate big numbers to me?
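
(For reference, the "3^^^3" here is Knuth's up-arrow notation, which is exactly the kind of succinct notation at issue. A minimal expansion, using the standard definitions:

    3 \uparrow 3 = 3^3 = 27
    3 \uparrow\uparrow 3 = 3^{3^3} = 3^{27} = 7{,}625{,}597{,}484{,}987
    3 \uparrow\uparrow\uparrow 3 = 3 \uparrow\uparrow (3 \uparrow\uparrow 3) = \underbrace{3^{3^{\cdot^{\cdot^{\cdot^{3}}}}}}_{7{,}625{,}597{,}484{,}987\ \text{threes}}

Five characters of notation name a number inexpressibly larger than anything a mugger whose largest word is "12" could state.)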

Comment author: Eugine_Nier 22 May 2011 06:43:46PM 1 point

The point of stating the large number succinctly is that it overwhelms the small likelihood of the mugger's story being true, at least if you have something resembling a Solomonoff prior. Note also that the mugger isn't really necessary for the scenario; he's merely there to supply a hypothesis that you could have come up with on your own.
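
(A rough sketch of this argument in symbols, assuming for illustration a utility function linear in the number of people affected: a Solomonoff-style prior penalizes a hypothesis h by roughly a factor of 2^{-K(h)}, where K(h) is the length in bits of its shortest description. The mugger's hypothesis can be stated in well under 10^4 bits, so

    P(h) \gtrsim 2^{-K(h)} \geq 2^{-10^{4}}
    \left| \mathbb{E}[\Delta U] \right| \approx 2^{-K(h)} \cdot (3 \uparrow\uparrow\uparrow 3) \gg U(\$5),

because 3^^^3 utterly dwarfs 2^{10^4}. The prior's penalty grows only exponentially in description length, while succinctly describable numbers grow far faster.)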

Comment author: Amanojack 22 May 2011 08:42:02PM 0 points

Good point. I guess the only way to counter these odd scenarios is to point out that everyone's utility function is different, and then the question is simply whether the responder wants to self-modify (or would be happier in the long run doing so) even after hearing some rationalist arguments to clarify their intuitions. The question of self-modification is a little hard to grasp, but at least it avoids all these far-fetched situations.

Comment author: Eugine_Nier 22 May 2011 09:28:54PM 1 point

For the Pascal's mugging problem, I don't think that will help.

Comment author: Amanojack 22 May 2011 09:48:09PM -2 points

Isn't Pascal's mugging just this?

"Give me five dollars, or I'll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3^^^^3 people."

I'd just walk away. Why should I care? If I thought about it for so long that I had some lingering qualms, and I got mugged like that a lot, I'd self-modify just to enjoy the rest of my life more.

As an aside, I don't think people really care that much about other people dying unless they have some way to connect to it. Someone probably was murdered while you were reading this comment. Is it going to keep you up? On the other hand, people can cry all night about a video game character dying. It's all subjective.

Comment author: endoself 22 May 2011 10:11:30PM 1 point

> As an aside, I don't think people really care that much about other people dying unless they have some way to connect to it. Someone probably was murdered while you were reading this comment. Is it going to keep you up? On the other hand, people can cry all night about a video game character dying. It's all subjective.

There's a difference between mental distress and action-motivating desire. If I were asked to pay $5 to prevent someone from being murdered with near-certainty, I would. On the other hand, I would not pay $5 more for a video game where a character does not die, though I can't be sure of this self-simulation because I play video games rather infrequently. If I only had $5, I would definitely spend it on the former option.

I do not allow my mental distress to respond to the same things that motivate my actions; intuitively grasping the magnitude of existential risks is impossible and even thinking about a fraction of that tragedy could prevent action, such as by causing depression. However, existential risks still motivate my decisions.

Comment author: Amanojack 23 May 2011 03:55:58PM 0 points

I thought of a way that I could be mugged Pascal-style: if I had to watch even one simulated person being tortured in gratuitous, holodeck-level realistic detail for even a minute unless I paid $5, I'd pay. I also wouldn't self-modify to make me not care about seeing simulated humans tortured in such a way, because I'm afraid that would make my interactions with people I know and care about very strange. I wouldn't want to be callous about witnessing people tortured, because I think it would take away part of my enjoyment of life. (And there are ways to amp up such scenarios to make them far worse, like if I were forced to torture to death 100 simulations of the people I most care about in a holodeck in order to save those actual people... that would probably have very bad consequences for me, and self-modifying so that I wouldn't care would just make it worse.)

But let's face it, the vast majority of people are indeed pretty callous about the actual deaths happening today, all the pain experienced by livestock as they're slaughtered, and all the pain felt by chronic pain sufferers. People decry such things loudly, but few of those who aren't directly connected to the victims are losing sleep over such suffering, even though there are actions they could conceivably take to mitigate it. It is uncomfortable to acknowledge, but it seems undeniable.

Comment author: endoself 24 May 2011 03:12:20AM 0 points

> I thought of a way that I could be mugged Pascal-style

It's not Pascal's mugging unless it works with ridiculously low probabilities. Would you pay $5 to avoid a 10^-30 chance of watching 3^^^3 people being tortured?
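
(Concretely: for an agent whose utility is linear in the number of people tortured, the expected-value arithmetic still says to pay, since

    \mathbb{E}[\Delta U] = 10^{-30} \cdot U(3 \uparrow\uparrow\uparrow 3\ \text{people tortured}) \gg U(\$5),

as 3^^^3 exceeds 10^{30} by an unimaginable margin. The question is whether your intuitions actually endorse that arithmetic at probabilities that small.)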

> People decry such things loudly, but few of those who aren't directly connected to the victims are losing sleep over such suffering, even though there are actions they could conceivably take to mitigate it. It is uncomfortable to acknowledge, but it seems undeniable.

Are you including yourself in "the vast majority of people"? Are you including most of LW? If your utility is bounded, you are probably not vulnerable to Pascal's mugging. If your utility is not bounded, it is irrelevant whether other people act like their utilities are bounded. Note that even egoists can have unbounded utility functions.
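
(A sketch of why a bounded utility function resists the mugging, using one illustrative bounded form, U(x) = B(1 - e^{-x/c}), which never exceeds the cap B no matter how large x grows: the mugger's threat can change expected utility by at most

    10^{-30} \cdot B,

which is negligible next to the sure cost of $5 for any cap B of plausible size. With unbounded utility there is no such cap, so a sufficiently extreme threat always dominates.)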

Comment author: Amanojack 25 May 2011 08:18:32AM 1 point

Are you losing sleep over the daily deaths in Iraq? Are most LWers? That's all I'm saying. I consider myself pretty far above average empathy-wise, to the extent that if I saw someone be tortured and die I'd probably be completely changed as a person. If I spent more time thinking about the war I probably wouldn't be able to sleep at all, and eventually, if I steeped myself in the reality of the situation, I'd probably go insane or die of grief. The same would probably happen if I spent all my time watching slaughterhouse videos. So I'm not pretending to be callous. I'm just trying to inject some reality into the discussion. If we cared as much as we signal we do, no one would be able to go to work, or post on LW. We'd all be too grief-stricken.

So although it depends on what exactly you mean by "unbounded utility function," it seems that no one's utility function is really unbounded. And it also isn't immediately clear that anyone would really want their utility function to be unbounded (unless I'm misinterpreting the term).

Also, point taken about my scenario not being a Pascal's mugging situation.