Raoul589 comments on Welcome to Heaven - Less Wrong

23 Post author: denisbider 25 January 2010 11:22PM




Comment author: Raoul589 27 January 2013 01:46:25AM 2 points [-]

It seems, then, that anti-wireheading boils down to the claim 'wireheading, boo!'.

This is not a convincing argument to people whose brains don't say to them 'wireheading, boo!'. My impression was that denisbider's top level post was a call for an anti-wireheading argument more convincing than this.

Comment author: lavalamp 27 January 2013 04:14:30PM 1 point [-]

I use my current value system to evaluate possible futures. The current me really doesn't like the possible future me sitting stationary in the corner of a room doing nothing, even though that version of me is experiencing lots of happiness.

I guess I view wireheading as equivalent to suicide; you're entering a state in which you'll no longer affect the rest of the world, and from which you'll never emerge.

No arguments will work on someone who's already wireheaded, but for someone who is considering it, hopefully they'll consider the negative effects on the rest of society. Your friends will miss you, you'll be a resource drain, etc. We already have an imperfect wireheading option; we call it drug addiction.

If none of that moves you, then perhaps you should wirehead.

Comment author: TheOtherDave 27 January 2013 04:29:52PM 4 points [-]

Is the social-good argument your true rejection, here?

Does it follow from this that if you concluded, after careful analysis, that you sitting stationary in a corner of a room experiencing various desirable experiences would be a net positive to the rest of society (your friends will be happy for you, you'll consume fewer net resources than if you were moving around, eating food, burning fossil fuels to get places, etc., etc.), then you would reluctantly choose to wirehead, and endorse others for whom the same were true to do so?

Or is the social good argument just a soldier here?

Comment author: lavalamp 27 January 2013 06:28:24PM 5 points [-]

After some thought, I believe that the social good argument, if it somehow came out the other way, would in fact move me to reluctantly change my mind. (Your example arguments didn't do the trick, though. To get my brain to imagine an argument that would move me, I had to imagine a world where my continued interaction with other humans in fact harms them in ways I cannot avoid; something like 'I'm an evil person, I don't wish to be evil, but it's not possible for me to cease being evil' all being true.) I'd still at least want a minecraft version of wireheading and not a drugged-out version, I think.

Comment author: TheOtherDave 27 January 2013 07:16:35PM 2 points [-]

Cool.

Comment author: Raoul589 28 January 2013 01:19:14AM 0 points [-]

So you would only wirehead if doing so would prevent you from doing active, intentional harm to others. Why is your standard so high? TheOtherDave's speculative scenario should be sufficient to make you support wireheading, if your argument against it is social good, since in his scenario it is clearly net better to wirehead than not to.

Comment author: lavalamp 28 January 2013 01:34:52AM 0 points [-]

None of the things he lists are true for me personally, and I had trouble imagining worlds in which they were true of me or anyone else. (The exception is the resource argument: I imagine e.g. welfare recipients would consume fewer resources, but anyone gainfully employed, AFAIK, generally adds more value to the economy than they remove.)

Comment author: TheOtherDave 28 January 2013 05:51:44AM 0 points [-]

FWIW, I don't find it hard to imagine a world where automated tools that require fewer resources to maintain than I do are at least as good as I am at doing any job I can do.

Comment author: lavalamp 28 January 2013 01:29:53PM 0 points [-]

Ah, see, for me that sort of world has human-level machine intelligence, which makes it really hard to make predictions about.

Comment author: TheOtherDave 28 January 2013 03:30:45PM 0 points [-]

Yes, agreed that automated tools with human-level intelligence are implicit in the scenario.
I'm not quite sure what "predictions" you have in mind, though.

Comment author: lavalamp 28 January 2013 07:35:04PM 0 points [-]

That was poorly phrased, sorry. I meant it's difficult to reason about in general. For example, I expect futures with human-level machine intelligences to be really unstable, rapidly turning into either FAI heaven or uFAI hell. I also expect them not to be particularly resource-constrained, such that the marginal effect of one human wireheading would be pretty much nil. But I hold all beliefs about this sort of future with very low confidence.

Comment author: ArisKatsaris 27 January 2013 06:21:17PM 2 points [-]

We don't need to be motivated by a single purpose. The part of our brains that is morality and considers what is good for the rest of the world, the part of our brains that just finds it aesthetically displeasing to be wireheaded for whatever reason, the part of our brains that just seeks pleasure: they may all have different votes of different weights to cast.

Comment author: Kawoomba 27 January 2013 07:06:11PM 1 point [-]

I against my brother, my brothers and I against my cousins, then my cousins and I against strangers.

Which bracket do I identify with at the point in time when I'm asked the question? Which perspective do I take? That's what determines the purpose. You might say: well, your own perspective. But that's the thing: my perspective depends (beyond the time of day and my current hormonal status) on the way the question is framed, and on which identity level I identify with most at that moment.

Comment author: Raoul589 28 January 2013 01:21:28AM 0 points [-]

Does it follow from that that you could consider taking the perspective of your post-wirehead self?

Comment author: Kawoomba 28 January 2013 07:00:31AM 0 points [-]

Consider in the sense of "what would my wireheaded self do", yes. Similar to Anja's recent post. However, I'll never (I can't imagine the circumstances under which I would) be in a state of mind where doing so would seem natural to me.

Comment author: TheOtherDave 27 January 2013 07:20:27PM 0 points [-]

Yes. But insofar as that's true, lavalamp's suggestion that Raoul589 should wirehead if the social-good argument doesn't move them becomes less clear.