Edit, May 21, 2012: Read this comment by Yvain.
Forming your own opinion is no more necessary than building your own furniture.
There's been a lot of talk here lately about how we need better contrarians. I don't agree. I think the Sequences got everything right and I agree with them completely. (This of course makes me a deranged, non-thinking, Eliezer-worshiping fanatic for whom the singularity is a substitute religion. Now that I have admitted this, you don't have to point it out a dozen times in the comments.) Even the controversial things, like:
- I think the many-worlds interpretation of quantum mechanics is the closest to correct and you're dreaming if you think the true answer will have no splitting (or I simply do not know enough physics to know why Eliezer is wrong, which I think is pretty unlikely but not totally discountable).
- I think cryonics is a swell idea and an obvious thing to sign up for if you value staying alive and have enough money and can tolerate the social costs.
- I think mainstream science is too slow and we mere mortals can do better with Bayes.
- I am a utilitarian consequentialist, and I think that if you allow someone to die through inaction, you're just as culpable as a murderer.
- I completely accept the conclusion that it is worse to put dust specks in 3^^^3 people's eyes than to torture one person for fifty years (for a sense of how large that number is, see the expansion just after this list). I came up with it independently, so maybe it doesn't count; whatever.
- I tentatively accept Eliezer's metaethics, considering how unlikely it is that there will be a better one (maybe morality is in the gluons?).
- "People are crazy, the world is mad," is sufficient for explaining most human failure, even to curious people, so long as they know the heuristics and biases literature.
- Edit, May 27, 2012: You know what? I forgot one: Gödel, Escher, Bach is the best.
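An aside on scale, since the dust-specks point leans on it: 3^^^3 is Knuth's up-arrow notation, and a minimal expansion runs as follows.

3^3 = 27
3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987
3^^^3 = 3^^(3^^3) = 3^^7,625,597,484,987, a tower of exponentiated 3s roughly 7.6 trillion levels tall

The result is unimaginably larger than the number of particles in the observable universe, which is the whole point of the thought experiment.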
There are two tiny notes of discord on which I disagree with Eliezer Yudkowsky. One is that I'm not as sure as he is that a rationalist is only made when a person breaks with the world and starts seeing everybody else as crazy; the other is that I don't share his objection to creating conscious entities in the form of an FAI or within an FAI. I could explain, but no one ever discusses these things, and they don't affect any important conclusions. I also think the Sequences are badly organized and that you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.
Furthermore, I agree with every essay I've ever read by Yvain, I use "believe whatever gwern believes" as a heuristic/algorithm for generating true beliefs, and I don't disagree with anything I've ever seen written by Vladimir Nesov, Kaj Sotala, Luke Muehlhauser, komponisto, or even Wei Dai; policy debates should not appear one-sided, so it's good that they don't.
I write this because I'm feeling more and more lonely in this regard. If you also stand by the Sequences, feel free to say so. If you don't, feel free to say that too, but please don't substantiate it. I don't want this thread to be a low-level rehash of tired debates, though it will surely have some of that in spite of my sincerest wishes.
Holden Karnofsky said:
I believe I have read the vast majority of the Sequences, including the AI-foom debate, and that this content - while interesting and enjoyable - does not have much relevance for the arguments I've made.
I can't understand this. How could the Sequences not be relevant? Half of them were written while Eliezer was thinking about AI problems.
So I say this, hoping others will as well:
I stand by the Sequences.
And with that, I tap out. I have found the answer, so I am leaving the conversation.
Even though I am not important here, I don't want my silence from now on to be read as agreement.
After some degree of thought and nearly 200 comment replies on this article, I regret writing it. I was insufficiently careful, didn't think enough about how it might alter the social dynamics here, and didn't spend enough time clarifying, especially regarding the third bullet point. I also dearly hope that I have not entrenched anyone's positions, turning them into allied soldiers to be defended, especially not my own. I'm sorry.
You seem to mostly disagree in spirit with all of Grognor's points but the last, though on that one you didn't share your impression of the heuristics and biases literature.
I'll chime in and say that at some point about two years ago I would have more or less agreed with all six points. These days I disagree in spirit with all six points and with the approach to rationality that they represent. I've learned a lot in the meantime, and various people, including Anna Salamon, have said that I seem like I've gained fifteen or twenty IQ points. I've read all of Eliezer's posts maybe three times over and I've read many of the cited papers and a few books, so my disagreement likely doesn't stem from not having sufficiently appreciated Eliezer's sundry cases. Many times when I studied the issues myself and looked at a broader set of opinions in the literature, or looked for justifications of the unstated assumptions I found, I came away feeling stupid for having been confident of Eliezer's position: often Eliezer had very much overstated the case for his positions, and very much ignored or fought straw men of alternative positions.
His arguments and their distorted echoes lead one to think that various people or conclusions are obviously wrong and thus worth ignoring: that philosophers mostly just try to be clever, and that their conclusions are worth taking seriously more or less only insofar as they mirror or glorify science; that supernaturalism, p-zombie-ism, theism, and other philosophical positions are clearly wrong, absurd, or incoherent; that quantum physicists who don't accept MWI just don't understand Occam's razor or are making some similarly simple error; that normal people are clearly biased in all sorts of ways, and that this has been demonstrated so convincingly that you can easily explain away any popular belief if necessary; that religion is bad because it's one of the biggest impediments to a bright, Enlightened future; and so on.

It seems to me that many LW folk end up thinking they're right about contentious issues where many people disagree with them, even when they haven't looked at their opponents' best arguments, and even when they don't have a coherent understanding of their opponents' position or their own. Sometimes they don't even seem to realize that there are important people who disagree with them, as in the case of heuristics and biases. Such unjustified confidence and self-reinforcing ignorance constitute a glaring, serious, fundamental, and dangerous problem for any epistemology that wishes to lay claim to rationality.
Disagree in spirit? What exactly does that mean?
(I happen to mostly agree with your comment while also mostly agreeing with Grognor's points, hence my confusion about what exactly you mean.)