MarkusRamikin comments on Thoughts on the Singularity Institute (SI) - Less Wrong

Post author: HoldenKarnofsky, 11 May 2012 04:31AM (256 points)




Comment author: MarkusRamikin 17 May 2012 07:56:56AM 3 points

"It is also now the 3rd most highly voted post"

1st.

At this point even I am starting to be confused.

Comment author: TheOtherDave 17 May 2012 04:30:45PM 1 point

Can you articulate the nature of your confusion?

Comment author: MarkusRamikin 17 May 2012 04:46:56PM 7 points

I suppose it's that I naively expect, when opening the list of top LW posts ever, to see ones containing the most impressive or clever insights into rationality.

Not that I don't think Holden's post deserves a high score for other reasons. While I am not terribly impressed with his AI-related arguments, the post meets the very highest standards of conduct: it shows how to have a disagreement that is polite, and it goes far beyond what is usually called "constructive".

Comment author: TheOtherDave 17 May 2012 05:17:18PM 4 points

(nods) Makes sense.

My own primary inference from the popularity of this post is that there's a lot of uncertainty/disagreement within the community about the idea that creating an AGI without an explicit (and properly tuned) moral structure constitutes a significant existential risk, but that the social dynamics of the community cause most of that uncertainty/disagreement to go unvoiced most of the time.

Of course, there's lots of other stuff going on as well that has little to do with AGI or existential risk, and a lot to do with the social dynamics of the community itself.

Comment author: [deleted] 14 June 2012 01:38:11PM 0 points

Maybe. I upvoted it because it will have (and has had) the effect of improving SI's chances.

Comment author: aceofspades 07 June 2012 08:36:37PM 2 points

Some people who upvoted the post may think it is one of the best-written and most important examples of instrumental rationality on this site.