lukeprog comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM


Comment author: lukeprog 11 May 2012 10:13:23PM *  36 points [-]

This post is highly critical of SIAI — both of its philosophy and its organizational choices. It is also now the #1 most highly voted post in the entire history of LessWrong — higher than any posts by Eliezer or myself.

I shall now laugh harder than ever when people try to say with a straight face that LessWrong is an Eliezer-cult that suppresses dissent.

Comment author: Eliezer_Yudkowsky 12 May 2012 02:36:01PM *  13 points [-]

Either I promoted this and then forgot I'd done so, or someone else promoted it - of course I was planning to promote it, but I thought I'd planned to do so on Tuesday after the SIAIers currently running a Minicamp had a chance to respond, since I expected most RSS subscribers to the Promoted feed to read comments only once (this is the same reason I wait a while before promoting e.g. monthly quotes posts). On the other hand, I certainly did upvote it the moment I saw it.

Comment author: lukeprog 12 May 2012 05:23:12PM 2 points [-]

Original comment now edited; I wasn't aware anyone besides you might be promoting posts.

Comment author: JackV 12 May 2012 09:29:41AM 10 points [-]

I agree (as a comparative outsider) that the polite response to Holden is excellent. Many (most?) communities -- both online communities and real-world organisations, especially long-standing ones -- are not good at this, for many reasons. The measured response of evaluating and promoting Holden's post is exactly what LessWrong members would hope LessWrong could do, and here it succeeded.

I agree that this is good evidence that LessWrong isn't just an Eliezer-cult. (The true test would be if Eliezer and another long-standing poster were dismissive of the post, and then other people persuaded them otherwise. In fact, maybe people should roleplay that or something, just to avoid getting stuck in an argument-from-authority trap -- but that's a silly idea. Either way, the fact that other people spoke positively, and that Eliezer and other long-standing posters did too, is a good thing.)

However, I'm not sure it's as uniquely a victory for the rationality of LessWrong as it sounds. In response to srdiamond, Luke quoted tenlier saying "[Holden's] critique mostly consists of points that are pretty persistently bubbling beneath the surface around here, and get brought up quite a bit. Don't most people regard this as a great summary of their current views, rather than persuasive in any way?" To me, that suggests that Holden did a really excellent job of expressing these views clearly and persuasively. But it also suggests that previous people had tried to express something similar and hadn't expressed it well enough for it to be widely accepted, and that readers had failed to sufficiently apply the dictum "fix your opponents' arguments for them". I'm not sure that's true (it's certainly not automatically true), but I suspect it might be. What do people think?

If there's any truth to it, it suggests one good answer to the recent post http://lesswrong.com/lw/btc/how_can_we_get_more_and_better_lw_contrarians (whether that goal is desirable in general or not): as a rationalist exercise, someone familiar with the community and good at writing rationally could survey contrarian views that community members may have held but not been able to express. They needn't bother with the showmanship of pretending to believe those views themselves; they could just say "I think what some people think is [well-expressed argument]. Do you agree that's fair? If so, do I and other people think they have a point?" Whether or not the argument is right, it's still good to engage with it if many people are thinking it.

Comment author: pleeppleep 12 May 2012 05:30:48PM 4 points [-]

Third highest now. Eliezer just barely gets into the top 20.

Comment author: MarkusRamikin 17 May 2012 07:56:56AM 3 points [-]

It is also now the 3rd most highly voted post

1st.

At this point even I am starting to be confused.

Comment author: TheOtherDave 17 May 2012 04:30:45PM 1 point [-]

Can you articulate the nature of your confusion?

Comment author: MarkusRamikin 17 May 2012 04:46:56PM *  7 points [-]

I suppose it's that I naively expect, when opening the list of top LW posts ever, to see ones containing the most impressive or clever insights into rationality.

Not that I don't think Holden's post deserves a high score for other reasons. While I am not terribly impressed with his AI-related arguments, the post exemplifies the very highest standards of conduct: it shows how to have a disagreement that is polite and goes far beyond what is usually called "constructive".

Comment author: TheOtherDave 17 May 2012 05:17:18PM 4 points [-]

(nods) Makes sense.

My own primary inference from the popularity of this post is that there's a lot of uncertainty/disagreement within the community about the idea that creating an AGI without an explicit (and properly tuned) moral structure constitutes significant existential risk, but that the social dynamics of the community cause most of that uncertainty/disagreement to go unvoiced most of the time.

Of course, there's lots of other stuff going on as well that has little to do with AGI or existential risk, and a lot to do with the social dynamics of the community itself.

Comment author: [deleted] 14 June 2012 01:38:11PM 0 points [-]

Maybe. I upvoted it because it will have (and has had) the effect of improving SI's chances.

Comment author: aceofspades 07 June 2012 08:36:37PM 2 points [-]

Some people who upvoted the post may think it is one of the best-written and most important examples of instrumental rationality on this site.

Comment author: [deleted] 12 May 2012 09:43:31AM 2 points [-]

I wish I could upvote this ten times.

Comment author: brazil84 13 February 2013 11:20:00PM 1 point [-]

Well, perhaps the normal practice is cult-like and dissent-suppressing, and this is an atypical break. Kind of like the fat person who starts eating salad instead of nachos while he watches football, and congratulates himself on his healthy eating even though he is still having donuts for breakfast and hamburgers and french fries for lunch.

Seems to me the test for suppression of dissent is not when a high-status person criticizes. The real test is when someone with medium or low status speaks out.

And my impression is that lesswrong does have problems along these lines. Not as bad as other discussion groups, but still.

Comment author: private_messaging 12 May 2012 07:16:58AM *  -2 points [-]

How about you also have a critical discussion of "where might we be wrong, and how do we make sure we are actually competent?" and "can we figure out what the AI will actually do, using our tools?" instead of "how do we communicate our awesomeness better?" and "are we communicating our awesomeness right?"

This post is something that can't be suppressed without losing big time, and your not suppressing it is only strong evidence that you are not completely stupid (which is great).

Comment author: Robin 15 May 2012 11:30:17AM *  -2 points [-]

But LW isn't reflective of SI; most of the people who voted on this article have no affiliation with SI. So the high number of upvotes is less reflective of SI welcoming criticism than of LW being dissatisfied with the organization of SI.

Furthermore, this post's criticism of Eliezer's research is less strong than its criticism of SI's organization. SI has always been somewhat open to criticism of its organizational structure, and many of SI's current leadership have criticized that structure at some point. But those who criticize Eliezer's research do not manage to rise in SI's research division, and generally aren't well received even on LW (Roko).

Lastly, laughing at somebody when they call your organization a cult is not a convincing argument; it makes them more likely to think of your organization as a cult (or at least to think you are arrogant).