Blueberry comments on That other kind of status - Less Wrong

72 points · Post author: Yvain 29 December 2009 02:45AM


Comments (108)


Comment author: multifoliaterose 13 June 2010 08:58:16AM 7 points

Eliezer has said that "it seems pretty obvious to me that some point in the not-too-distant future we're going to build an AI [...] it will be a superintelligence relative to us [...] in one to ten decades and probably on the lower side of that." (http://bloggingheads.tv/diavlogs/21857)

The vast majority of very smart and accomplished people (e.g. Nobel Prize winners in the sciences, Fields Medalists, founders of large tech corporations) do not subscribe to the view that the "singularity is near." This raises a strong possibility that people like Eliezer who think it's pretty obvious that "the singularity is near" are deluded for the same reason that the 9-11 Truthers are. As Yvain says, it's a boost to one's self-esteem to feel that one has "figured out a deep and important secret that the rest of the world is too complacent to realize."

Has there been any discussion of this matter in the Less Wrong archives?

Comment author: Blueberry 13 June 2010 05:24:41PM 0 points

Eliezer may well be off on the timescale; I would guess he's off by an order of magnitude. But an incorrect guess about the timescale of a future event does not give rise to a strong possibility that he's deluded, like the 9-11 Truthers, for ego reasons. Downvoted, because this reads more like an insult than a reasoned question.