I do not believe that the Singularity is likely to happen any time soon, even on astronomical timescales. Furthermore, I am far from convinced that, even if the Singularity were to happen, the transhuman AI would be able to achieve quasi-godlike status (i.e., it may never be able to reshape entire planets in a matter of minutes, rewrite everyone's DNA, travel faster than light, rewrite the laws of physics, etc.). In light of this, I believe that worrying about the friendliness of AI is kind of a waste of time.
I think I have good reasons for these beliefs, and I operate by Crocker's Rules, FWIW...
Furthermore, I am far from convinced that, even if the Singularity were to happen, the transhuman AI would be able to achieve quasi-godlike status [...] In light of this, I believe that worrying about the friendliness of AI is kind of a waste of time.
Anything that does not have sufficient intelligence to be considered a threat does not even remotely qualify as a 'Singularity'. (Your 'even if' really means 'just not gonna happen'.)
What do you believe that most people on this site don't?
I'm especially looking for things that you wouldn't even mention unless someone explicitly asked for them: stuff you're not even comfortable writing under your own name. Making a one-shot account here is very easy; go ahead and do that if you don't want to tarnish your image.
I think a big problem with a "community" dedicated to being less wrong is that it will make people more concerned about APPEARING less wrong. The biggest part of my intellectual journey so far has been the acquisition of new and startling knowledge, and that knowledge doesn't seem likely to turn up here under current conditions.
So please, tell me the crazy things you're otherwise afraid to say. I want to know them, because they might be true.