I was too lazy to write this up again; it's copy-and-paste work, so don't mind some inconsistencies. Regarding the quotes: I think EY seriously believes what he says in them, otherwise I wouldn't have posted them. I'm not even suggesting that it isn't true; I actually allow for the possibility that he is that smart. But I want to know what I should do, and right now I don't see any good arguments.
I'm a supporter and donor, and what I'm trying to do here is come up with the best possible arguments to undermine the credence of the SIAI. Almost nobody else is doing that, so I'm trying my best here. This isn't damaging, it's helpful. Because once you become really popular, people like P.Z. Myers and other, much more eloquent and prominent critics will pull you to pieces if you can't even respond to my poor attempt at playing devil's advocate.
I don't have difficulty comprehending the premises, either the ones you have questions about here or the others required to make an adequate evaluation for the purpose of decision making.
I don't even know where to start here, so I won't. But I haven't come across anything yet that I've had trouble understanding.
I rather suspect that if all those demands were met you would go ahead and find new rhetorical demands to make.
See that woman with red hair? Well, the cleric told me that he believes she's a witch. But he'll update on the evidence if the fire doesn't consume her. I said that red hair is insufficient evidence to support that hypothesis, let alone to justify such extreme measures to test it. He told me that if he came up with more evidence, like sorcery, I'd just go ahead and find new rhetorical demands.
You appear to be suggesting that Eliezer should censor the presentation of his thoughts on the subject so as to prevent people from having nightmares. Spot the irony! ;)
I'm not against free speech and religious freedom, but that also applies to my own thoughts on the subject. I believe he could do much more than censor certain ideas, namely show that they are bogus.
He told me that if he came up with more evidence, like sorcery, I'd just go ahead and find new rhetorical demands.
[See context for the implied meaning if the excerpt isn't clear.] I claimed approximately the same thing that you yourself say below.
I'm a supporter and donor, and what I'm trying to do here is come up with the best possible arguments to undermine the credence of the SIAI. Almost nobody else is doing that, so I'm trying my best here.
I've got nothing against the Devil; it's the Advocacy that is mostly bullshit. Saying you are 'Devil's Advoca...
[...] SIAI's Scary Idea goes way beyond the mere statement that there are risks as well as benefits associated with advanced AGI, and that AGI is a potential existential risk.
[...] Although an intense interest in rationalism is one of the hallmarks of the SIAI community, still I have not yet seen a clear logical argument for the Scary Idea laid out anywhere. (If I'm wrong, please send me the link, and I'll revise this post accordingly. Be aware that I've already at least skimmed everything Eliezer Yudkowsky has written on related topics.)
So if one wants a clear argument for the Scary Idea, one basically has to construct it oneself.
[...] If you put the above points all together, you come up with a heuristic argument for the Scary Idea. Roughly, the argument goes something like: If someone builds an advanced AGI without a provably Friendly architecture, probably it will have a hard takeoff, and then probably this will lead to a superhuman AGI system with an architecture drawn from the vast majority of mind-architectures that are not sufficiently harmonious with the complex, fragile human value system to make humans happy and keep humans around.
The line of argument makes sense, if you accept the premises.
But, I don't.
Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It), October 29, 2010. Thanks to XiXiDu for the pointer.