blacktrance comments on AALWA: Ask any LessWronger anything - Less Wrong

28 Post author: Will_Newsome 12 January 2014 02:18AM


Comments (611)


Comment author: blacktrance 14 January 2014 02:42:57AM 1 point

My sequence-superseded interests include the nature of free will, self-improvement (in the sense of luminosity, not productivity), and rational thinking in general.

Three areas where I disagree with the Sequences:

  • Fake Selfishness. EY mistakenly treats "selfishness" as something like wealth maximization, or at least something that excludes caring about others. Selfishness means acting in one's self-interest. There are three major philosophical views as to what people's interests are: hedonism (pleasure), preference satisfaction, and objective-list (i.e., if a person has the things on this list, their interests are being fulfilled). Wealth maximization is a plausible manifestation of self-interest only for a person with very atypical preferences or for an unusual list. There is no reason why egoism would automatically exclude caring about others - in fact, caring about others often makes people happy and fulfills their preferences. As for the assumption in the sentence "Shouldn't you be trying to persuade me to be more altruistic, so you can exploit me?", that ignores virtue-ethical egoism, as in the Epicurean tradition - the view that exploiting people (in the sense in which exploitation is bad) is not conducive to happiness, and that being honest, just, benevolent, etc., actually is in one's self-interest.

  • Not for the Sake of Happiness Alone. EY fails to apply reductionism to human values. He says, "I care about terminal values X, Y, and Z", but when it comes down to it, people would really like pleasure more than anything else, and the distinction between wanting and liking is irrelevant. To indulge in a bit of psychologizing, I think that trying to depict multiple values as irreducible comes from an aversion to wireheading - because if you conclude that all values reduce to happiness/pleasure, you must also conclude that wireheading is the ideal state. But I don't share this aversion - wireheading is the ideal state.

  • Because of the above, I disagree with basically the entirety of the Fun Theory sequence. It seems to be an attempt to reconcile Transhumanism as Simplified Humanism with not wanting to wirehead, and the two really aren't reconcilable - and Transhumanism as Simplified Humanism is correct.

Comment author: TheOtherDave 14 January 2014 06:26:26PM 0 points

Is there anyone in the world whose well-being you care strongly about?

Comment author: blacktrance 14 January 2014 07:48:23PM *  0 points

Yes, myself and others, though the well-being of others is an instrumental value.

Comment author: TheOtherDave 14 January 2014 08:20:37PM 0 points

To confirm: you're the only person whose well-being you care about "terminally"?

Comment author: blacktrance 14 January 2014 08:28:13PM -1 points

Yes.

Comment author: TheOtherDave 14 January 2014 08:44:34PM 1 point

(nods) OK. Accepting that claim as true, I agree that you should endorse wireheading.

(Also that you should endorse having everyone in the world suffer for the rest of their lives after your death, in exchange for you getting a tuna fish sandwich right now, because hey, a tuna fish sandwich is better than nothing.)

Do you believe that nobody else in the world "terminally" cares about the well-being of others?

Comment author: blacktrance 14 January 2014 08:55:13PM -1 points

you should endorse having everyone in the world suffer for the rest of their lives after your death, in exchange for you getting a tuna fish sandwich right now, because hey, a tuna fish sandwich is better than nothing

No, because I care (instrumentally) about the well-being of others in the future as well, and knowing that they'll be tortured, especially because of me, would reduce my happiness now by significantly more than a tuna sandwich would increase it.

Do you believe that nobody else in the world "terminally" cares about the well-being of others?

That's a difficult question to answer because of the difficulties surrounding what it means for someone to care. People's current values can change in response to introspection or empirical information - and not just instrumental values, but seemingly terminal values as well. This makes me question whether their seemingly terminal values were actually their terminal values to begin with. Certainly, people believe that they terminally care about the well-being of others, and if believing that you care qualifies as actually caring, then yes, they do care. But I don't think that someone who'd experience ideal wireheading would like anything else more.

Comment author: TheOtherDave 14 January 2014 09:12:34PM 0 points

I care (instrumentally) about the well-being of others in the future

What is the terminal goal which the well-being of people after your death achieves?

knowing that they'll be tortured

Oh, sure, you shouldn't endorse knowing about it. But it would be best, by your lights, if I set things up that way in order to give you a tuna fish sandwich, and kept you in ignorance. And you should agree to that in principle... right?

This makes me question whether their seemingly terminal values were actually their terminal values to begin with.

(nods) In the face of that uncertainty, how confident are you that your seemingly terminal values are actually your terminal values?

I don't think that someone who'd experience ideal wireheading would like anything else more.

(nods) I'm inclined to agree.

Comment author: blacktrance 14 January 2014 09:29:56PM -1 points

What is the terminal goal which the well-being of people after your death achieves?

Knowing that the people I care about will have a good life after I'm gone contributes to my current happiness.

But it would be best, by your lights, if I set things up that way in order to give you a tuna-fish sandwich, and kept you in ignorance. And you should agree to that in principle... right?

No, because I also care about having true beliefs. I cannot endorse being tricked.

In the face of that uncertainty, how confident are you that your seemingly terminal values are actually your terminal values?

Given the amount of introspection I've done, the discussions I've had with others, etc., I'm highly confident that my seemingly terminal values actually are my terminal values.

Comment author: TheOtherDave 14 January 2014 09:56:15PM *  0 points

No, because I also care about having true beliefs. I cannot endorse being tricked.

No trickery involved. There's simply a fact about the world of which you're unaware. There's a Vast number of such facts; what's one more?

Comment author: VAuroch 14 January 2014 06:06:53AM 0 points

What objection do you have to the argument "Most humans object to wireheading in principle, therefore wireheading is not the ideal state"?

Because it seems that a state most people would not choose voluntarily is not ideal.

Comment author: blacktrance 14 January 2014 06:25:41AM 0 points

Many humans have objected to many things in the past that are now widely accepted, so I don't find that to be a convincing objection.

Many people wouldn't choose it voluntarily, but some drug addicts wouldn't voluntarily choose to quit, either. Even if they didn't want it, they'd like it, and liking is what ultimately matters.