ShardPhoenix comments on A Much Better Life? - Less Wrong

61 Post author: Psychohistorian 03 February 2010 08:01PM




Comment author: ShardPhoenix 03 February 2010 10:42:47PM *  6 points [-]

It seems to me that the real problem with this kind of "advanced wireheading" is that while everything may be just great inside the simulation, you're still vulnerable to interference from the outside world (eg the simulation being shut down for political or religious reasons, enemies from the outside world trying to get revenge, relatives trying to communicate with you, etc). I don't think you can just assume this problem away, either (at least not in a psychologically convincing way).

Comment author: nazgulnarsil 04 February 2010 01:57:24PM 4 points [-]

1: Buy Experience Machine. 2: Buy a nuclear reactor capable of powering said machine for 2x my expected lifetime. 3: Buy raw materials (nutrients) capable of the same. 4: Launch it all out of the solar system at a delta-v that makes catching me prohibitively energy-expensive.

Comment author: ShardPhoenix 05 February 2010 10:36:24AM *  1 point [-]

That was my thought too, but I don't think it's what comes to mind when most people imagine the Matrix. And even then, you might feel (irrational?) guilt about the idea of leaving others behind, so it's not quite a "perfect" scenario.

Comment author: nazgulnarsil 08 February 2010 03:47:21AM -2 points [-]

um...family maybe. otherwise the only subjective experience i care about is my own.

Comment author: Matt_Simpson 04 February 2010 12:57:59AM 10 points [-]

Put yourself in the least convenient possible world. Does your objection still hold water? In other words, the argument is over whether or not we value pure hedonic pleasure, not whether it's a feasible thing to implement.

Comment author: ShardPhoenix 04 February 2010 02:03:06AM *  8 points [-]

It seems that we have the values we do because we don't live in the least (or in this case, most) convenient possible world.

In other words, imagine that you're stuck on some empty planet in the middle of a huge volume of known-life-free space. In this case a pleasant virtual world probably sounds like a much better deal. Even then you still have to worry about asteroids and supernovas and whatnot.

My point is that I'm not convinced that people's objection to wireheading is genuinely because of a fundamental preference for the "real" world (even at enormous hedonic cost), rather than because of inescapable practical concerns and their associated feelings.

edit:

A related question might be, how bad would the real world have to be before you'd prefer the matrix? If you'd prefer to "advanced wirehead" over a lifetime of torture, then clearly you're thinking about cost-benefit trade-offs, not some preference for the real-world that overrides everything else. In that case, a rejection of advanced wireheading may simply reflect a failure to imagine just how good it could be.

Comment author: Psychohistorian 04 February 2010 07:44:15PM 2 points [-]

If you'd prefer to "advanced wirehead" over a lifetime of torture, then clearly you're thinking about cost-benefit trade-offs, not some preference for the real-world that overrides everything else.

Whatever your meta-level goals, unless they are "be tortured for the rest of my life," there's simply no way to accomplish them while being tortured indefinitely. That said, suppose you had some neurological condition that caused you to live in constant excruciating pain, but otherwise in no way incapacitated you - now you could still accomplish meta-level goals, but you might still prefer the pain-free simulator. I doubt there's anyone who sincerely places zero value on hedons, but no one ever claimed such people existed.

Comment author: AndyWood 04 February 2010 04:21:47AM *  4 points [-]

People usually seem so intent on thinking up reasons why it might not be so great that I'm having a really hard time getting a read on what folks think of the core premise.

My life/corner of the world is what I think most people would call very good, but I'd pick the Matrix in a heartbeat. But note that I am taking the Matrix at face value, rather than wondering whether it's a trick of advertising. I can't even begin to imagine myself objecting to a happy, low-stress Matrix.

Comment author: Bugle 04 February 2010 02:44:25PM 5 points [-]

I agree - I think the original post is accurate about how people would respond to the suggestion in the abstract, but the actual implementation would undoubtedly hook vast swathes of the population. We live in a world where people already become addicted to vastly inferior simulations such as WoW.

Comment author: Shae 04 February 2010 05:31:59PM 1 point [-]

I disagree. I think that even the average long-term tortured prisoner would balk and resist if you walked up to him with this machine. In fact, I think fewer people would accept in real life than those who claim they would, in conversations like these.

The resistance may in fact reveal an inability to properly conceptualize the machine working, or it may not. As others have said, maybe you don't want to do something you think is wrong (like abandoning your relatives or being unproductive) even if you're guaranteed to later forget all about it and live in bliss. What if the machine ran on tortured animals? Or tortured humans you don't know? That shouldn't bother you any more than a machine that didn't, if all that matters is how you feel once you're hooked up.

We have some present-day corollaries. What about a lobotomy, or suicide? Even if these can be shown to be a guaranteed escape from unhappiness or neuroses, most people aren't interested, including some really unhappy people.

Comment author: MugaSofer 22 January 2013 10:57:39AM 0 points [-]

I think that even the average long-term tortured prisoner would balk and resist if you walked up to him with this machine.

I think the average long-term tortured prisoner would be desperate for any option that's not "get tortured more", considering that real torture victims will confess to crimes that carry the death penalty if they think this will make the torturer stop. Or, for that matter, crimes that carry the torture penalty, IIRC.

Comment author: byrnema 04 February 2010 02:10:56AM 4 points [-]

Yes, I agree that while not the first objection a person makes, this could be close to the 'true rejection'. Simulated happiness is fine -- unless it isn't really stable and dependable (because it wasn't real) and you're crudely awoken to discover the whole world has gone to pot and you've got a lot of work to do. Then you'll regret having wasted time 'feeling good'.