(nods) I see.
So, if you were simply skeptical about Yudkowsky/SIAI, you could dismiss them and walk away. But since you're emotionally involved and feel like you have to make it all go away in order to feel better, that's not an option for you.
The problem is, what you're doing isn't going to work for you either. You're just setting yourself up for a rather pointless and bitter conflict.
Surely this isn't a unique condition? I mean, there are plenty of groups out there who will tell you that there are various bad things that might happen if you don't read their book, donate to their organization, etc., etc., and you don't feel the emotional need to make them go away. You simply ignore them, or at least most of them.
How do you do that? Perhaps you can apply the same techniques here.
I managed to do that with the Jehovah's Witnesses. I grew up being told that I had to tell people about the Jehovah's Witnesses so that they would be saved. It was my responsibility. But this here is on a much more sophisticated level: it includes all the elements of organized religion mixed up with science and math. Incidentally, one of the first posts I read was Why Our Kind Can't Cooperate:
...The obvious wrong way to finish this thought is to say, "Let's do what the Raelians do! Let's add s...
Artificial general intelligence researcher Ben Goertzel answered my question on charitable giving and gave his permission to publish it here. I think the opinion of highly educated experts who have read most of the available material is important for estimating the public and academic perception of risks from AI, and the effectiveness with which those risks are communicated by LessWrong and the SIAI.
Alexander Kruel asked:
Ben Goertzel replied:
What can one learn from this?
I'm planning to contact various experts who are aware of the risks from AI and ask them the same question.