blacktrance comments on AALWA: Ask any LessWronger anything - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (611)
In the unlikely event that anyone is interested, sure, ask me anything.
Edit: Ethics are a particular interest of mine.
Would you rather fight one horse sized duck, or a hundred duck sized horses?
Depends on the situation. Do I have to kill whatever I'm fighting, or do I just have to defend myself? If it's the former, the horse-sized duck, because duck-sized horses would be too good at running away and hiding. If it's the latter, then the duck-sized horses, because they'd be easier to scatter.
Is this a fist-fight or can blacktrance use weapons?
Any topics of interest? Same goes for other 'whatever's
Ethics, I suppose. Most of my other interests are either probably too mindkilling for LW or are written about in the Sequences already, more clearly than I could write about them.
What are your Sequence-superseded interests? Would you please name three points from anywhere within them where your opinion differs (even if only slightly) from EY's (or from the author of the most relevant sequence, if different)?
My sequence-superseded interests include the nature of free will, self-improvement (in the sense of luminosity, not productivity), and general interest in rational thinking.
Three areas where I disagree with the Sequences:
Fake Selfishness. EY mistakenly treats "selfishness" as something like wealth maximization, or at least something that excludes caring about others. Selfishness means acting in one's self-interest. There are three major philosophical views as to what people's interests are: hedonism (pleasure), preference satisfaction, and objective-list (i.e., if a person has the things on this list, their interests are being fulfilled). Wealth maximization is only a plausible manifestation of self-interest for a person with very atypical preferences or for an unusual list. There is no reason why egoism would automatically exclude caring about others - in fact, caring about others often makes people happy and fulfills their preferences. As for the assumption in the sentence "Shouldn't you be trying to persuade me to be more altruistic, so you can exploit me?", that ignores virtue ethical egoism, as in the Epicurean tradition - that is, exploiting people (in the sense in which exploitation is bad) is not conducive to happiness, and being honest, just, benevolent, etc., is actually in one's self-interest.
Not for the Sake of Happiness Alone. EY fails to apply reductionism to human values. He says, "I care about terminal values X, Y, and Z", but when it comes down to it, people would really like pleasure more than anything else, and the distinction between wanting and liking is irrelevant. To indulge in a bit of psychologizing, I think that trying to depict multiple values as irreducible comes from an aversion to wireheading - because if you conclude that all values reduce to happiness/pleasure, you must also conclude that wireheading is the ideal state. But I don't share this aversion - wireheading is the ideal state.
Because of the above, I disagree with basically the entirety of the Fun Theory sequence. It seems to be an attempt to reconcile Transhumanism as Simplified Humanism with not wanting to wirehead, and the two really aren't reconcilable - and Transhumanism as Simplified Humanism is correct.
Is there anyone in the world whose well-being you care strongly about?
Yes, myself and others, though the well-being of others is an instrumental value.
To confirm: you're the only person whose well-being you care about "terminally"?
Yes.
(nods) OK. Accepting that claim as true, I agree that you should endorse wireheading.
(Also that you should endorse having everyone in the world suffer for the rest of their lives after your death, in exchange for you getting a tuna fish sandwich right now, because hey, a tuna fish sandwich is better than nothing.)
Do you believe that nobody else in the world "terminally" cares about the well-being of others?
What objection do you have to the argument "Most humans object to wireheading in principle; therefore, wireheading is not the ideal state"?
Because it seems like a state most people would not choose voluntarily, it is not ideal.
Many humans have objected to many things in the past that are now widely accepted, so I don't find that to be a convincing objection.
Many people wouldn't choose it voluntarily, but some drug addicts wouldn't voluntarily choose to quit, either. Even if they didn't want it, they'd like it, and liking is what ultimately matters.