MarkusRamikin comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: MarkusRamikin 14 May 2012 03:41:32PM 28 points [-]

You're allowed to say these things on the public Internet?

I just fell in love with SI.

Comment author: lukeprog 26 May 2012 12:33:50AM *  21 points [-]

You're allowed to say these things on the public Internet?

Well, at our most recent board meeting I wasn't fired, reprimanded, or even questioned for making these comments, so I guess I am. :)

Comment author: TheOtherDave 14 May 2012 04:20:43PM 8 points [-]

Well, all we really know is that he chose to. It may be that everyone he works with then privately berated him for it.
That said, I share your sentiment.
Actually, if SI generally endorses this sort of public "airing of dirty laundry," I encourage others involved in the organization to say so out loud.

Comment author: shminux 14 May 2012 06:04:43PM 18 points [-]

I just fell in love with SI.

It's Luke you should have fallen in love with, since he is the one turning things around.

Comment author: wedrifid 26 May 2012 02:24:14AM 43 points [-]

It's Luke you should have fallen in love with, since he is the one turning things around.

On the other hand, I can count with one hand the number of established organisations I know of that would be sociologically capable of ceding power, status, and control to Luke the way SingInst did. They took an untrained intern with essentially zero external status from past achievements and affiliations and basically decided to let him run the show (at least in terms of publicly visible initiatives). It is clearly the right thing for SingInst to do, and admittedly Luke is very tall and has good hair, which generally gives a boost when it comes to such selections; but still, making the appointment goes fundamentally against normal human behavior.

(Where I say "count with one hand" I am not including the use of any digits thereupon. I mean one.)

Comment author: Matt_Simpson 19 July 2012 07:05:00PM 7 points [-]

...and admittedly Luke is very tall and has good hair which generally gives a boost when it comes to such selections...

Even though I completely understand why this phrase was included, I still found it hilarious in a network-sitcom sort of way.

Comment author: [deleted] 14 May 2012 07:58:32PM *  0 points [-]

Consider the implications in light of HoldenKarnofsky's critique of SI's pretensions to high rationality.

  1. Rationality is winning.

  2. SI, at the same time as it was claiming extraordinary rationality, was behaving in ways that were blatantly irrational.

  3. Although this is supposedly due to "the usual causes," rationality (winning) subsumes overcoming akrasia.

  4. HoldenKarnofsky is correct that SI made claims for its own extraordinary rationality at a time when its leaders weren't rational.

  5. Further: why should anyone give SI credibility today—when it stands convicted of self-serving misrepresentation in the recent past?

Comment author: ciphergoth 15 May 2012 06:26:06AM 5 points [-]

You've misread the post - Luke is saying that he doesn't think the "usual defeaters" are the most likely explanation.

Comment author: lukeprog 25 May 2012 05:42:34PM 3 points [-]

Correct.

Comment author: thomblake 14 May 2012 08:03:44PM 5 points [-]

As a minor note, observe that claims of extraordinary rationality do not necessarily contradict claims of irrationality. The sanity waterline is very low.

Comment author: TheOtherDave 14 May 2012 09:12:55PM 5 points [-]

Do you mean to imply in context here that the organizational management of SIAI at the time under discussion was above average for a nonprofit organization? Or are you just making a more general statement that a system can be irrational while demonstrating above average rationality? I certainly agree with the latter.

Comment author: ciphergoth 15 May 2012 06:30:46AM 9 points [-]

Are you comparing it to the average among nonprofits started, or nonprofits extant? I would guess that it was well below average for extant nonprofits, but about or slightly above average for started nonprofits. I'd guess that most nonprofits are started by people who don't know what they're doing and don't know what they don't know, and that SI probably did slightly better because the people who were being a bit stupid were at least very smart, which can help. However, I'd guess that most such nonprofits don't live long because they don't find a Peter Thiel to keep them alive.

Comment author: David_Gerard 16 May 2012 11:07:48AM 6 points [-]

Your assessment looks about right to me. I have considerable experience of averagely-incompetent nonprofits, and SIAI looks normal to me. I am strongly tempted to grab that "For Dummies" book and, if it's good, start sending copies to people ...

Comment author: TheOtherDave 15 May 2012 12:44:48PM 0 points [-]

In the context of thomblake's comment, I suppose nonprofits started is the proper reference class.

Comment author: thomblake 15 May 2012 01:51:19PM 0 points [-]

Or are you just making a more general statement that a system can be irrational while demonstrating above average rationality?

Yes, this.

On an arbitrary scale I just made up, below 100 degrees of rationality is "irrational", and 0 degrees of rationality is "ordinary". 50 is extraordinarily rational and yet irrational.

Comment author: shminux 14 May 2012 08:10:09PM *  0 points [-]

Just to let you know, you've just made it on my list of the very few LW regulars I no longer bother replying to, due to the proven futility of any communications. In your case it is because you have a very evident ax to grind, which is incompatible with rational thought.

Comment author: metaphysicist 14 May 2012 08:34:42PM 1 point [-]

This comment seems strange. Is having an ax to grind opposed to rationality? Then why does Eliezer Yudkowsky, for example, not hesitate to advocate for causes such as friendly AI? Doesn't he have an ax to grind? More of one really, since this ax chops trees of gold.

It would seem intellectual honesty would require that you say you reject discussions with people with an ax to grind, unless you grind a similar ax.

Comment author: shminux 14 May 2012 08:46:21PM *  1 point [-]

From http://www.usingenglish.com: "If you have an axe to grind with someone or about something, you have a grievance, a resentment and you want to get revenge or sort it out." One can hardly call the unacknowledged emotions of resentment and the need for revenge/retribution compatible with rationality. srdiamond piled up a bunch of negative statements about SI (partially correct, but irrelevant in the context of my comment), making these emotions quite clear.

Comment author: metaphysicist 14 May 2012 09:17:48PM 0 points [-]

That's a restrictive definition of "ax to grind," by the way—it's normally used to mean any special interest in the subject: "an ulterior often selfish underlying purpose <claims that he has no ax to grind in criticizing the proposed law>" (Merriam-Webster's Collegiate Dictionary)

But I might as well accept your meaning for discussion purposes. If you detect unacknowledged resentment in srdiamond, don't you detect unacknowledged ambition in Eliezer Yudkowsky?

There's actually good reason for the broader meaning of "ax to grind." Any special stake is a bias. I don't think you can say that someone who you think acts out of resentment, like srdiamond, is more intractably biased than someone who acts out of other forms of narrow self-interest, which almost invariably applies when someone defends something he gets money from.

I don't think it's a rational method to treat people differently, as inherently less rational, when they seem resentful. It is only one of many difficult biases. Financial interest is probably more biasing. If you think the arguments are crummy, that's something else. But the motive--resentment or finances--should probably have little bearing on how a message is treated in serious discussion.

Comment author: JGWeissman 14 May 2012 09:58:11PM 7 points [-]

don't you detect unacknowledged ambition in Eliezer Yudkowsky?

Eliezer certainly has a lot of ambition, but I am surprised to see an accusation that this ambition is unacknowledged.

Comment author: TheOtherDave 14 May 2012 10:10:51PM 5 points [-]

The impression I get from scanning their comment history is that metaphysicist means to suggest here that EY has ambitions he hasn't acknowledged (e.g., the ambition to make money without conventional credentials), not that he fails to acknowledge any of the ambitions he has.

Comment author: shminux 14 May 2012 10:10:22PM *  1 point [-]

I don't think it's a rational method to treat people differently, as inherently less rational, when they seem resentful.

Thank you for this analysis, it made me think more about my motivations and their validity. I believe that my decision to permanently disengage from discussions with some people is based on the futility of such discussions in the past, not on the specific reasons they are futile. At some point I simply decide to cut my losses.

There's actually good reason for the broader meaning of "ax to grind." Any special stake is a bias.

Indeed, present company not excluded. The question is whether it permanently prevents the ax-grinder from listening. EY, too, has his share of unacknowledged irrationalities, but both his status and his ability to listen and to provide insights make engaging him in a discussion a rewarding, if sometimes frustrating, experience.

I do not know why srdiamond's need to bash SI is so entrenched, or whether it can be remedied to a degree where he is once again worth talking to, so at this point it is instrumentally rational for me to avoid replying to him.