Grognor comments on Muehlhauser-Goertzel Dialogue, Part 1 - Less Wrong

28 points · Post author: lukeprog · 16 March 2012 05:12PM




Comment author: Grognor 17 March 2012 07:53:19AM 27 points

This exchange significantly decreased my probability that Ben Goertzel is a careful thinker about AI problems. I think he has a good point about "rationalists" being too invested in "rationality" (as opposed to rationality), but his AI thoughts are just seriously wtf. In tune with the Cosmos? Does this mean anything at all? I hate to say it based on a short conversation, but it looks like Ben Goertzel hasn't made any of his intuitions precise enough to even be wrong. And he makes the classic mistake of thinking "any intelligence" would avoid certain goal-types (e.g. 'fill the future light cone with some type of substance') because they're... stupid? I don't even...

Quoth Yvain:

If I asked you to prove that colorless green ideas do not sleep furiously, you wouldn't know where or how to begin.

Comment author: Rain 17 March 2012 11:59:17PM 3 points

He published a book called A Cosmist Manifesto which presumably describes some of his thoughts in more detail. It looked too new-age for me to take much interest.

Comment author: Normal_Anomaly 17 March 2012 03:18:46PM 2 points

Upvoted.

Goertzel's belief in AI FOOMs coupled with his beliefs in psi phenomena and the inherent stupidity of paperclipping made me lower my confidence in the likelihood of AI FOOMs slightly. Was this a reasonable operation, do you think?

Comment author: Giles 17 March 2012 06:21:16PM 5 points

It depends.

  • If you were previously aware of Goertzel's belief in AI FOOM but not of his opinions on psi/paperclipping, then you should lower your confidence slightly.
  • If the SIAI were wheeling out Goertzel as an example of "look, here's someone who believes in FOOM," then it should lower your confidence.
  • If you were previously unaware of Goertzel's belief in FOOM, then it should probably increase your confidence very slightly. Reversed stupidity is not intelligence.

Obviously the quantity of "slightly" depends on what other evidence/opinions you have to hand.
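The three cases above can be sketched as odds-form Bayesian updates. The likelihood ratios below are purely illustrative assumptions (the thread gives no numbers); the point is only the direction of each update:

```python
def update(prior: float, likelihood_ratio: float) -> float:
    """Odds-form Bayes: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

prior = 0.5  # illustrative prior probability of FOOM

# Case 1: you had already counted Goertzel's endorsement as evidence,
# say with likelihood ratio 2.0. Learning he is a less careful thinker
# than assumed means you over-weighted that endorsement, so replace it
# with a weaker ratio (1.2): the posterior falls back toward the prior.
over_weighted = update(prior, 2.0)   # ~0.667
corrected = update(prior, 1.2)       # ~0.545, lower than before

# Case 3: you were previously unaware of his endorsement. Even a
# not-very-careful thinker's belief is weak positive evidence (ratio
# slightly above 1, not below it -- "reversed stupidity is not
# intelligence"), so the posterior rises very slightly.
slight_increase = update(prior, 1.1)  # ~0.524, above the prior
```

The direction of the update thus depends entirely on what the endorsement had already contributed to your prior, which is Giles's point.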

Comment author: Normal_Anomaly 17 March 2012 08:05:48PM 0 points

This is a good analysis. I was previously weakly aware of Goertzel's beliefs on psi/paperclipping, and didn't know much about his opinions on AI other than that he was working on superhuman AGI but didn't have as much concern for Friendliness as SIAI. So I suppose my confidence shouldn't change very much either way. I'm still on the fence on several questions related to Singularitarianism, so I'm trying to get evidence wherever I can find it.