Wei_Dai comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 points · Post author: HoldenKarnofsky 11 May 2012 04:31AM

Comment author: Wei_Dai 11 May 2012 02:45:15AM 50 points [-]

Is it just me, or do Luke and Eliezer's initial responses appear to send the wrong signals? From the perspective of an SI critic, Luke's comment could be interpreted as saying "for us, not being completely incompetent is worth bragging about", and Eliezer's as "we're so arrogant that we've only taken two critics (including Holden) seriously in our entire history". These responses seem suboptimal, given that Holden just complained about SI's lack of impressive accomplishments and about its being too selective about whose feedback to take seriously.

Comment author: Will_Newsome 11 May 2012 03:47:13AM 19 points [-]

Eliezer's comment makes me think that you, specifically, should consider collecting your criticisms and putting them in Main where Eliezer is more likely to see them and take the time to seriously consider them.

Comment author: Wei_Dai 12 May 2012 06:44:22PM 0 points [-]

I replied here.

Comment author: Nick_Beckstead 11 May 2012 03:56:21AM 51 points [-]

While I have sympathy with the complaint that SI's critics are inarticulate and often say wrong things, Eliezer's comment does seem to be indicative of the mistake Holden and Wei Dai are describing. Most extant presentations of SIAI's views leave much to be desired in terms of clarity, completeness, concision, accessibility, and credibility signals. This makes it harder to make high quality objections. I think it would be more appropriate to react to poor critical engagement more along the lines of "We haven't gotten great critics. That probably means that we need to work on our arguments and their presentation," and less along the lines of "We haven't gotten great critics. That probably means that there's something wrong with the rest of the world."

Comment author: ChrisHallquist 11 May 2012 04:04:08AM 27 points [-]

This. I've been trying to write something about Eliezer's debate with Robin Hanson, but the problem I keep running up against is that Eliezer's points are not clearly articulated at all. Even making my best educated guesses about what's supposed to go in the gaps in his arguments, I still ended up with very little.

Comment author: jacob_cannell 17 May 2012 09:04:05AM 4 points [-]

Have the key points of that 'debate' subsequently been summarized or clarified on LW? I found that debate exasperating in that Hanson and EY were mainly talking past each other and couldn't seem to home in on their core disagreements.

I know it generally has to do with hard takeoff / recursive self-improvement vs. a more gradual EM revolution, but that's not saying all that much.

Comment author: Kaj_Sotala 17 May 2012 07:13:22PM 13 points [-]

I'm in the process of writing a summary and analysis of the key arguments and points in that debate.

The most recent version runs at 28 pages - and that's just an outline.

Comment author: somervta 17 January 2013 09:02:44AM 0 points [-]

If you need help with grunt work, please send me a message. If (as I suspect is the case) not, then good luck!

Comment author: Kaj_Sotala 18 January 2013 07:29:27AM 0 points [-]

Thanks, I'm fine. I posted a half-finished version here, and expect to do some further refinements soon.

Comment author: jacob_cannell 17 May 2012 11:14:36PM 0 points [-]

Awesome, look forward to it. I'd offer to help but I suspect that wouldn't really help. I'll just wax enthusiastic.

Comment author: private_messaging 17 May 2012 07:08:08AM *  1 point [-]

This. Well, the issue is the probability that it's just gaps. Ultimately, it's the sort of thing that would only constitute even a weak argument from authority if the speaker had very, very impressive accomplishments. Otherwise you're left assuming the simplest explanation, which doesn't involve the presence of unarticulated points of any importance.

A gapless argument, like a math proof, could trump authority if valid; an argument with gaps, on the other hand, is very prone to being trumped.

Comment author: Nick_Beckstead 11 May 2012 05:11:05AM 5 points [-]

In fairness I should add that I think Luke M agrees with this assessment and is working on improving these arguments/communications.

Comment author: lukeprog 11 May 2012 07:21:31PM 8 points [-]

Agree with all this.

Comment author: magfrump 11 May 2012 04:50:01AM 7 points [-]

Luke's comment addresses the specific point that Holden made about changes in the organization given the change in leadership.

Holden said:

I'm aware that SI has relatively new leadership that is attempting to address the issues behind some of my complaints. I have a generally positive impression of the new leadership; I believe the Executive Director and Development Director, in particular, to represent a step forward in terms of being interested in transparency and in testing their own general rationality. So I will not be surprised if there is some improvement in the coming years, particularly regarding the last couple of statements listed above. That said, SI is an organization and it seems reasonable to judge it by its organizational track record, especially when its new leadership is so new that I have little basis on which to judge these staff.

Luke attempted to provide (for the reader) a basis on which to judge these staff members.

Eliezer's response was... characteristic of Eliezer? And also very short and coming at a busy time for him.

Comment author: Nebu 31 December 2012 12:15:42PM 0 points [-]

Eliezer's response was... characteristic of Eliezer? And also very short and coming at a busy time for him.

I think that's Wei_Dai's point: these "characteristic" replies are fine if you're used to him, but bad if you're not.

Comment author: magfrump 31 December 2012 07:26:19PM 1 point [-]

Yeah, I mean, as time goes on I increasingly think of Eliezer as being kind of a jerk. I thought Luke's post was good and Eliezer's wasn't, but I also expected longer posts to be forthcoming (which they were).

Comment author: lukeprog 11 May 2012 07:15:52PM 18 points [-]

Luke's comment could be interpreted as saying "for us, not being completely incompetent is worth bragging about"

Really? I personally feel pretty embarrassed by SI's past organizational competence. To me, my own comment reads more like "Wow, SI has been in bad shape for more than a decade. But at least we're improving very quickly."

Also, I very much agree with Beckstead on this: "Most extant presentations of SIAI's views leave much to be desired in terms of clarity, completeness, concision, accessibility, and credibility signals. This makes it harder to make high quality objections." And also this: "We haven't gotten great critics. That probably means that we need to work on our arguments and their presentation."

Comment author: Wei_Dai 11 May 2012 08:37:07PM 13 points [-]

Really?

Yes, I think it at least gives a bad impression to someone, if they're not already very familiar with SI and sympathetic to its cause. Assuming you don't completely agree with the criticisms that Holden and others have made, you should think about why they might have formed wrong impressions of SI and its people. Comments like the ones I cited seem to be part of the problem.

I personally feel pretty embarrassed by SI's past organizational competence. To me, my own comment reads more like "Wow, SI has been in bad shape for more than a decade. But at least we're improving very quickly."

That's good to hear, and thanks for the clarifications you added.

Comment author: Polymeron 20 May 2012 06:05:14PM 1 point [-]

It's a fine line though, isn't it? Saying "huh, looks like we have much to learn, here's what we're already doing about it" is honest and constructive, but sends a signal of weakness and defensiveness to people not bent on a zealous quest for truth and self-improvement. Saying "meh, that guy doesn't know what he's talking about" would send the stronger social signal, but would not be constructive to the community actually improving as a result of the criticism.

Personally I prefer plunging ahead with the first approach. Both in the abstract for reasons I won't elaborate on, but especially in this particular case. SI is not in a position where its every word is scrutinized; it would actually be a huge win if it gets there. And if/when it does, there's a heck of a lot more damning stuff that can be used against it than an admission of past incompetence.

Comment author: Vaniver 20 May 2012 06:16:16PM 0 points [-]

sends a signal of weakness and defensiveness to people not bent on a zealous quest for truth and self-improvement.

I do not see why this should be a motivating factor for SI; to my knowledge, they advertise primarily to people who would endorse a zealous quest for truth and self-improvement.

Comment author: Polymeron 20 May 2012 06:25:30PM 2 points [-]

That subset of humanity holds considerably less power, influence and visibility than its counterpart; resources that could be directed to AI research and for the most part aren't. Or in three words: Other people matter. Assuming otherwise would be a huge mistake.

I took Wei_Dai's remarks to mean that Luke's response is public, and so can reach the broader public sooner or later; and when examined in a broader context, that it gives off the wrong signal. My response was that this was largely irrelevant, not because other people don't matter, but because of other factors outweighing this.

Comment author: ciphergoth 11 May 2012 06:34:15AM 4 points [-]

Are there other specific critiques you think should have made Eliezer's list, or is it that you think he should not have drawn attention to their absence?

Comment author: Wei_Dai 11 May 2012 07:39:41AM 26 points [-]

Are there other specific critiques you think should have made Eliezer's list, or is it that you think he should not have drawn attention to their absence?

Many of Holden's criticisms have been made by others on LW already. He quoted me in Objection 1. Discussion of whether Tool-AI and Oracle-AI are or are not safe have occurred numerous times. Here's one that I was involved in. Many people have criticized Eliezer/SI for not having sufficiently impressive accomplishments. Cousin_it and Silas Barta have questioned whether the rationality techniques being taught by SI (and now the rationality org) are really effective.

Comment author: Furcas 11 May 2012 03:15:54AM *  23 points [-]

Luke isn't bragging, he's admitting that SI was/is bad but pointing out it's rapidly getting better. And Eliezer is right, criticisms of SI are usually dumb. Could their replies be interpreted the wrong way? Sure, anything can be interpreted in any way anyone likes. Of course Luke and Eliezer could have refrained from posting those replies and instead posted carefully optimized responses engineered to send nothing but extremely appealing signals of humility and repentance.

But if they did turn themselves into politicians, we wouldn't get to read what they actually think. Is that what you want?

Comment author: Wei_Dai 11 May 2012 08:30:50AM *  27 points [-]

Luke isn't bragging, he's admitting that SI was/is bad but pointing out it's rapidly getting better.

But the accomplishments he listed (e.g., having a strategic plan, website redesign) are of the type that Holden already indicated to be inadequate. So why the exhaustive listing, instead of just giving a few examples to show SI is getting better and then either agreeing that they're not yet up to par, or giving an argument for why Holden is wrong? (The reason I think he could be uncharitably interpreted as bragging is that he would more likely exhaustively list the accomplishments if he was proud of them, instead of just seeing them as fixes to past embarrassments.)

And Eliezer is right, criticisms of SI are usually dumb.

I'd have no problem with "usually" but "all except two" seems inexcusable.

But if they did turn themselves into politicians, we wouldn't get to read what they actually think. Is that what you want?

Do their replies reflect their considered, endorsed beliefs, or were they just hurried remarks that may not say what they actually intended? I'm hoping it's the latter...

Comment author: Kaj_Sotala 11 May 2012 10:10:04AM *  38 points [-]

But the accomplishments he listed (e.g., having a strategic plan, website redesign) are of the type that Holden already indicated to be inadequate. So why the exhaustive listing, instead of just giving a few examples to show SI is getting better and then either agreeing that they're not yet up to par, or giving an argument for why Holden is wrong?

Presume that SI is basically honest and well-meaning, but possibly self-deluded. In other words, they won't outright lie to you, but they may genuinely believe that they're doing better than they really are, and cherry-pick evidence without realizing that they're doing so. How should their claims of intending to get better be evaluated?

Saying "we're going to do things better in the future" is some evidence about SI intending to do better, but rather weak evidence, since talk is cheap and it's easy to keep thinking that you're really going to do better soon but there's this one other thing that needs to be done first and we'll get started on the actual improvements tomorrow, honest.

Saying "we're going to do things better in the future, and we've fixed these three things so far" is stronger evidence, since it shows that you've already begun fixing problems and might keep up with it. But it's still easy to make a few improvements and then stop. There are far more people who start a diet, follow it for a while, and then quit than there are people who actually diet for as long as they initially intended.

Saying "we're going to do things better in the future, and here's the list of 18 improvements that we've implemented so far" is much stronger evidence than either of the two above, since it shows that you've spent a considerable amount of effort on improvements over an extended period of time, enough to presume that you actually care deeply about this and will keep up with it.

I don't have a cite at hand, but it's been my impression that in a variety of fields, having maintained an activity for longer than some threshold amount of time is a far stronger predictor of keeping up with it than having maintained it for a shorter time. E.g. many people have thought about writing a novel and many people have written the first five pages of a novel. But when considering the probability of finishing, the difference between the person who's written the first 5 pages and the person who's written the first 50 pages is much bigger than the difference between the person who's written the first 100 pages and the person who's written the first 150 pages.

There's a big difference between managing some performance once, and managing sustained performance over an extended period of time. Luke's comment is far stronger evidence of SI managing sustained improvements over an extended period of time than a comment just giving a few examples of improvement.

Comment author: private_messaging 12 May 2012 03:49:46PM 0 points [-]

I don't think there's a sharp distinction between self-deception and effective lying. To lie effectively, you have to run some process that takes the falsehood as true.

Comment author: Kaj_Sotala 13 May 2012 07:19:52AM 4 points [-]

The main difference is that if there's reason to presume that they're lying, any claims of "we've implemented these improvements" that you can't directly inspect become worthless. Right now, if they say something like "Meetings with consultants about bookkeeping/accounting; currently working with our accountant to implement best practices and find a good bookkeeper", I trust them enough to believe that they're not just making it up even though I can't personally verify it.

Comment author: Eugine_Nier 13 May 2012 06:25:53PM 3 points [-]

On the other hand, you can't trust their claims that these meetings are accomplishing anything.

Comment author: Kaj_Sotala 13 May 2012 07:11:04PM 1 point [-]

True.

Comment author: lukeprog 11 May 2012 07:26:57PM *  1 point [-]

I've added a clarifying remark at the end of this comment and another at the end of this comment.

Comment author: thomblake 11 May 2012 07:34:11PM 8 points [-]

I think it's unfair to take Eliezer's response as anything other than praise for this article. He noted already that he did not have time to respond properly.

And why even point out that a human's response to anything is "suboptimal"? It will be notable when a human does something optimal.

Comment author: faul_sname 11 May 2012 10:22:58PM 9 points [-]

We do, on occasion, come up with optimal algorithms for things. Also, "suboptimal" usually means "I can think of several better solutions off the top of my head", not "This solution is not maximally effective".

Comment author: ChrisHallquist 11 May 2012 03:58:27AM 5 points [-]

I read Luke's comment just as "I'm aware these are issues and we're working on it." I didn't read him as "bragging" about the ones that have been solved. Eliezer's... I see the problem with. I initially read it as just complimenting Holden on his high-quality article (which I agree was high-quality), but I can see it being read as backhanded at anyone else who's criticized SIAI.

Comment author: private_messaging 11 May 2012 07:13:02AM *  0 points [-]

They're the correct signals. Incompetents inherently signal incompetence; competence can't be faked beyond a superficial level (and faking competence is all about signalling that you're sure you're competent). The lack of feedback is inherent in assuming "we are sending the wrong signal" rather than "maybe we really are incompetent."

Comment author: [deleted] 11 May 2012 08:39:33AM 0 points [-]

I kind-of agree about Eliezer's comment, but Luke's doesn't sound like that to me.

Comment author: [deleted] 11 May 2012 08:41:22AM 5 points [-]

Retracted. I've just re-read Eliezer's comment more calmly, and it's not that bad either.