The whole libertarianism-versus-socialism thing is one area where transhumanism imports elements of cultishness. If you are already a libertarian and you become familiar with transhumanism, you will probably import your existing arguments against socialism into your transhumanist perspective. Same for socialism. So you see various transhumanist organizations having political leadership struggles between socialist and libertarian factions who would probably be having the same struggles if they were part of an international chess club or some other group.
The whole thing becomes entrenched in debates about things like Transition Guides and "how to implement transhumanist policy in a [socialist/libertarian] way that's best for everyone." I always thought these discussions were discourse at the "fandom" level of the transhumanist community, but after reading some of Eliezer's posts about his own experiences at transhumanist/singularitarian events, I see that it happens at all levels.
Half-formed thought I need to pursue more offline, but I'll write it down now: If you say "I am a transhumanist" and you say "I am a libertarian," and you then try to find libertarian ways to meet transhumanist goals, you have made your transhumanism subservient to your libertarianism. I think it is better to find transhumanist ways to meet libertarian goals. The fact that a group of transhumanists would derail a debate by getting into politics seems to express to me that the group has made transhumanism the subservient value. Which seems inelegant, given that transhumanism is probably the simpler value. Seems like there's a possible post for my own blog brewing in there, but I have to think about it some.
Yesterday, "Overcoming Cryonics" wrote:
One, there is nothing in the Overcoming Bias posting policy against transhumanism.
Two, as a matter of fact, I do try to avoid proselytizing here. I have other forums in which to vent my thoughts on transhumanism. When I write a blog post proselytizing transhumanism, it looks like this, this, or this.
But it's hard for me to avoid all references to transhumanism. "Overcoming Cryonics" commented on a post in which there was exactly one reference to a transhumanist topic. I had said:
What, exactly, am I supposed to do about that? The first time I ever got up on stage, I was in fact talking about the Singularity! That's the actual history! Transhumanism is not a hobby for me, it's my paid day job as a Research Fellow of the Singularity Institute. Asking me to avoid all mentions of transhumanism is like asking Robin Hanson to avoid all mentions of academia.
Occasionally, someone remarks that I seem to take notions like the Singularity on faith, because I mention them but don't defend them.
I don't defend my views here, because I know that not everyone is interested in the considerable volume of work I have produced on transhumanism. You can find it on yudkowsky.net.
If, however, you don't like any mention of transhumanism, even as an illustration of some other point about rationality - well, this is a blog. These are blog posts. They are written in the first person. I am occasionally going to use anecdotes from my history, or even, y'know, transcribe my thought processes a little?
Given the amount of time that I spend thinking about transhumanism, I naturally tend to think of transhumanist illustrations for my points about rationality. If I had spent the last eleven years as a geologist, I would find it easy to illustrate my ideas by talking about rocks. If you don't like my illustrations and think you can do better, feel free to invent superior illustrations and post them in the comments. I may even adopt them.
On some transhumanist topics, such as cryonics, I haven't written all that much myself. But there is plenty about cryonics at Alcor or Cryonics Institute. Also, the Transhumanist FAQ has some nice intros. If you don't want it discussed here, then why are you asking?
I will probably post explicitly on cryonics at some point, because I think there are some points about sour grapes for which I would have difficulty finding an equally strong illustration. Meanwhile, yes, I sometimes do mention "cryonics" as the archetype for a socially weird belief which happens to be true. No matter what I use as an example of "socially weird but true", some people are going to disagree with it. Otherwise it wouldn't be an example. And weird-but-true is certainly an important topic in rationality - otherwise there would be a knockdown argument against ever dissenting.
Even after checking the referenced sources, you might find that you - gasp! - still disagree with me. Oh, the horror! The horror! Surely you don't read any other blogs where one of the authors occasionally disagrees with you.
Just because this blog is called Overcoming Bias, it does not mean that any time any author says something you disagree with, you should comment "OMG! How biased! I am sooo disappointed in you I thought you would do better." Part of the art of rationality is having extended discussions with people you disagree with. "OMG U R BIASED!" does not present much basis for continuing discussion.
It is a good rule of thumb that you should never flatly accuse someone of being "biased". Name the specific bias that attaches to the specific problem. Conjunction fallacy? Availability?
If you disagree with someone, you presumably think they're doing something wrong. Saying "You are like so totally biased, dude" is not helpful. If you strike a tragic, sorrowful pose and go, "Oh, alas, oh, woe, I am so disappointed in you," it is still not helpful. If you point to a specific belief that you disagree with and say, "See, that belief is biased," then that doesn't convey any additional information beyond "I disagree with that belief." Which bias? There are quite a lot of possibilities.
If you think that "rationality" means people will agree with you on their first try, so that anyone who doesn't do this can be dismissed out of hand as a poseur, you have an exaggerated idea of how obvious your beliefs are.
So stop telling me, or Robin Hanson, "Why, you... you... you're not absolutely rational!" We already know that.
Just because I try to be rational doesn't mean I think I'm a god.
Well, sure, I want to be a god when I grow up, but that is like a totally different issue from that first part.
Except that both goals involve Bayesian methods.
(And are intertwined in other ways you won't realize until it's too late to turn back.)
Thank you.
Yours in the darkest abyssal depths of sincerity,
Eliezer Yudkowsky.