TimFreeman comments on Inferring Our Desires - Less Wrong

37 Post author: lukeprog 24 May 2011 05:33AM




Comment author: steven0461 24 May 2011 06:37:08PM 18 points

> As such, we'd be unlikely to get what we really want if the world was re-engineered in accordance with a description of what we want that came from verbal introspective access to our motivations.

Interesting as these experimental results are, it sounds to me like you're saying that there's a license to be human (or a license to be yourself, or a license to be your current self).

Suppose I found out that many of my actions that seemed random were actually subtly aimed at invading Moldova, perhaps because aliens with weird preferences placed some functional equivalent of mind control lasers in my brain, and suppose that this fact was not introspectively accessible to me; e.g., a future where Moldova is invaded does not feel more utopian to imagine than the alternatives. Isn't there an important sense in which, in that hypothetical, I don't care about invading Moldova? What if the mind control laser was outside my brain, perhaps in orbit? At what point do I get to say, "I won't let my so-called preferences stop me from doing what's right?"

My impression is that this mindset, where you determine what to do by looking closely at the world to see what you're already doing, and then giving that precedence over what seems right, would be seen as an alien mindset by anyone not affected by certain subtle misunderstandings of the exact sense in which value is subjective. My impression is that once these misunderstandings go away and people ask themselves what considerations they're really moved by, they'll find out that where their utility function (or preferences or whatever) disagrees with what, on reflection, seems right, they genuinely don't care (at least in any straightforward way) what their preferences are, paradoxical as that sounds.

Or am I somehow confused here?

Comment author: TimFreeman 24 May 2011 06:52:02PM 0 points

> Suppose I found out that many of my actions that seemed random were actually subtly aimed at invading Moldova, perhaps because aliens with weird preferences placed some functional equivalent of mind control lasers in my brain

I suspect you'd prefer the aliens turn off their mind-control lasers, and if you had a choice you would have preferred they did not turn on the lasers in the first place.

Once you're corrupted, you're corrupted. At that point we have a mind-controlled Steven wandering around and there's not much point in trying to learn about human motivation from the behavior of humans who are mind-controlled by aliens.

Comment author: steven0461 24 May 2011 07:01:04PM 4 points

So the next question is, what if it's not space aliens, but an alien god?

Comment author: TimFreeman 24 May 2011 08:29:42PM 2 points

> what if it's not space aliens, but an alien god [really evolution]?

Well, then it's unlikely that your random unconscious actions have any ulterior motive as sophisticated as invading Moldova. Your true desires are probably just some combination of increasing your status, activities prone to make babies, and your conscious desires, assuming the conscious desires haven't been subverted by bad philosophy.

I don't see much harm in activities prone to make babies, so the real question here is "If my unconscious desires lead me to have poor relationships because I'm gaming them for status, and I don't consciously value status, would I want to fix that by changing the unconscious desires?" I think I would, if I could be sure my income wouldn't be affected much, and the fix was well tested, preferably on other people.

But in any case, human volition is going to look like a clump of mud. It has a more-or-less well defined position, but not exactly, and the boundaries are unclear.