MarkusRamikin comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: Richard_Loosemore 10 May 2012 07:11:15PM 1 point [-]

Holden, I think your assessment is accurate ... but I would venture to say that it does not go far enough.

My own experience with SI, and my background, might be relevant here. I am a member of the Math/Physical Science faculty at Wells College, in Upstate NY. I also have had a parallel career as a cognitive scientist/AI researcher, with several publications in the AGI field, including the opening chapter (coauthored with Ben Goertzel) in a forthcoming Springer book about the Singularity.

I have long complained about SI's narrow and obsessive focus on the "utility function" aspect of AI -- simply put, SI assumes that future superintelligent systems will be driven by certain classes of mechanism that are still only theoretical, and which are very likely to be superseded by other kinds of mechanism with very different properties. Even worse, the "utility function" mechanism favored by SI is quite likely to be so unstable that it will never allow an AI to achieve any kind of human-level intelligence, never mind the kind of superintelligence that would be threatening.

Perhaps most important of all, though, is the fact that the alternative motivation mechanism might (and notice that I am being cautious here: might) lead to systems that are extremely stable. Which means both friendly and safe.

Taken in isolation, these thoughts and arguments might amount to nothing more than a minor addition to the points that you make above. However, my experience with SI is that when I tried to raise these concerns back in 2005/2006 I was subjected to a series of attacks that culminated in a tirade of slanderous denunciations from the founder of SI, Eliezer Yudkowsky. After delivering this tirade, Yudkowsky then banned me from the discussion forum that he controlled, and instructed others on that forum that discussion about me was henceforth forbidden.

Since that time I have found that when I partake in discussions on AGI topics in a context where SI supporters are present, I am frequently subjected to abusive personal attacks in which reference is made to Yudkowsky's earlier outburst. This activity is now so common that when I occasionally post comments here, my remarks are very quickly voted down below a threshold that makes them virtually invisible. (A fate that will probably apply immediately to this very comment).

I would say that, far from deserving support, SI should be considered a cult-like community in which dissent is ruthlessly suppressed in order to exaggerate the point of view of SI's founders and controllers, regardless of the scientific merits of those views, or of the dissenting opinions.

Comment author: MarkusRamikin 10 May 2012 08:02:54PM *  4 points [-]

However, my experience with SI is that when I tried to raise these concerns back in 2005/2006 I was subjected to a series of attacks that culminated in a tirade of slanderous denunciations from the founder of SI, Eliezer Yudkowsky.

I am frequently subjected to abusive personal attacks in which reference is made to Yudkowsky's earlier outburst

Link to the juicy details cough I mean evidence?

Comment author: Oscar_Cunningham 10 May 2012 08:17:39PM 10 points [-]
Comment author: ChrisHallquist 11 May 2012 04:16:45AM 3 points [-]

As someone who was previously totally unaware of that flap, that doesn't sound to me like a "slanderous tirade." Maybe Loosemore would care to explain what he thought was slanderous about it?

Comment author: Richard_Loosemore 10 May 2012 08:35:27PM -1 points [-]

Markus: Happy to link to the details, but where in the huge stream would you like to be linked to? The problem is that opinions can be sharply skewed by choosing to link to only selected items.

I cite as evidence Oscar's choice, below, to link to a post by EY. In that post he makes a series of statements that are flagrant untruths. If you read that particular link, and take his word as trustworthy, you get one impression.

But if you knew that EY had to remove several quotes from their context and present them in a deceitful manner, in order to claim that I said things that I did not, you might get a very different impression.

You might also get a different impression if you knew this. The comment that Oscar cites came shortly after I offered to submit the dispute to outside arbitration by an expert in the field we were discussing. I offered that ANYONE could propose an outside expert, and I would abide by their opinion.

It was only at that point that EY suddenly wrote the post that Oscar just referenced, in which he declared me to be banished from the list and (a short time later) that all discussion about the topic should cease.

That fact by itself speaks volumes.

Comment author: Multiheaded 10 May 2012 08:42:08PM 5 points [-]

I was reading SL4 around that time and saw the whole drama (although I couldn't understand all the exact technical details, being 16). My prior on EY flagrantly lying like that is incredibly low. I'm virtually certain that you're quite cranky in this regard.

Comment author: gwern 10 May 2012 09:07:49PM 13 points [-]

I was on SL4 as well, and regarded Eliezer as basically correct, although I thought Loosemore's ban was more than a little disproportionate. (If John Clark didn't get banned for repeatedly and willfully misunderstanding Gödelian arguments, wasting the time of countless posters over many years, why should Loosemore be banned for backtracking on some heuristics & biases positions?)

Comment author: Eliezer_Yudkowsky 11 May 2012 01:58:22AM 8 points [-]

(Because JKC never lied about his credentials, which is where it really crosses the line into trolling.)

Comment author: [deleted] 13 May 2012 12:36:45AM *  5 points [-]

Because JKC never lied about his credentials...

The animus here must be really strong. What Yudkowsky did was infer that Loosemore was lying about being a cognitive scientist from his ignorance of a variant of the Wason experiment. First, people often forget obvious things in heated online discussions. Second, there are plenty of incompetent cognitive scientists: if Loosemore intended to deceive, he probably wouldn't have expressly stated that he didn't have teaching responsibilities for graduate students.

Comment author: Will_Newsome 13 May 2012 01:31:16AM *  3 points [-]

If what you say is true, then Eliezer is lying about Loosemore lying about his credentials, in which case Eliezer is "trolling". But if what you say is false, then you are the "troll".

(This comment is an attempt to convincingly demonstrate that Eliezer's notion of trolling is, to put it bluntly, both harmful and dumb.)

Comment author: [deleted] 13 May 2012 02:59:31AM 4 points [-]

If what you say is true, then Eliezer is lying about Loosemore lying about his credentials, in which case Eliezer is "trolling". But if what you say is false, then you are the "troll". (This comment is an attempt to convincingly demonstrate that Eliezer's notion of trolling is, to put it bluntly, both harmful and dumb.)

I don't know about you, but I'd rather be considered a troll than a liar; correspondingly, I think the expanded definition of "liar" is worse than the inaccurate definition of "troll". Not every inaccuracy amounts to dishonesty, and not all dishonesty to prevarication.

Comment author: Will_Newsome 13 May 2012 12:00:22AM 11 points [-]

trolling

You use this word in an unconventional way, i.e., you use it to mean something like 'unfairly causing harm and wasting people's time', which is not the standard definition: the standard definition necessitates intention to provoke or at least something in that vein. (I assume you know what "trolling" means in the context of fishing?) Because it's only ever used in sensitive contexts, you might want to put effort into finding a more accurate word or phrase. As User:Eugine_Nier noted, lately "troll" and "trolling" have taken on a common usage similar to "fascist" and "fascism", which I think is an unfortunate turn of events.

Comment author: MarkusRamikin 10 May 2012 08:45:22PM *  9 points [-]

I'll gladly start reading at any point you'll link me to.

The fact that you don't just provide a useful link, but instead offer several paragraphs of excuses for why the material I'd be reading is untrustworthy, I count as (small) evidence against you.

Comment author: woodchuck64 10 May 2012 08:37:32PM -1 points [-]

I strongly suspect the rationality of the internet would improve many orders of magnitude if all arguments about arguments were quietly deleted.

Comment author: woodchuck64 10 May 2012 08:46:35PM *  0 points [-]

Okay, make that: I strongly suspect the rationality of the rational internet would improve many orders of magnitude if all arguments about arguments were quietly deleted.

Comment author: khafra 11 May 2012 04:53:39PM *  3 points [-]

Every time I try to think about that, I end up thinking about logical paradoxes instead.

edit for less subtlety in response to unexplained downvote: That argument is self-refuting.