wedrifid comments on The Singularity Institute's Arrogance Problem - Less Wrong

63 Post author: lukeprog 18 January 2012 10:30PM


Comment author: XiXiDu 20 January 2012 10:46:56AM, 34 points

I can smell the "arrogance," but do you think any of the claims in these paragraphs is false?

I am the wrong person to ask whether "a doctorate in AI would be negatively useful". I guess it is technically useful. And I am pretty sure that it is wrong to say that others are "not remotely close to the rationality standards of Less Wrong". That is of course the case for most humans, but I think there are quite a few people out there who are at least at the same level. I further think that it is quite funny to criticize the very people on whose work your arguments for risks from AI depend.

But that's beside the point. Those statements are clearly damaging when it comes to public relations.

If you want to win in this world, as a human being, you either have to be smart enough to overpower everyone else, or you have to get involved in a fair amount of social engineering and signaling games and refine your public relations.

Are you able to solve friendly AI without much more money and without hiring top-notch mathematicians, and then solve general intelligence in order to implement it and take over the world? If not, then at some point you will either need much more money or have to convince actual academics to work for you for free. And, most importantly, if you don't think you will be the first to invent AGI, then you need to talk to a lot of academics, companies and probably politicians to convince them that there is a real risk and that they need to implement your friendly AI theorem.

It is of the utmost importance to have an academic degree and reputation if you want people to listen to you. At some point it won't be enough to say, "I am a research fellow of the Singularity Institute who wrote a lot about rationality and cognitive biases, and you are not remotely close to our rationality standards." By the point you utter the word "Singularity", you have already lost. The very name of your charity shows that you underestimate the importance of signaling.

Do you think IBM, Apple or DARPA care about a blog and a popular fanfic? Do you think that you can even talk to DARPA without first getting involved in some amount of politics and making powerful people aware of the risks? And do you think you can talk to them as a "research fellow of the Singularity Institute"? If you are lucky, they might ask someone on their staff about you. And if you are really lucky, that person will say that you are for the most part well-meaning and thoughtful individuals who never quite grew out of their science-fiction addiction as adolescents (I didn't write that line myself; it is from an email conversation with a top-notch person who didn't give me permission to publish it). In any case, you won't make them listen to you, let alone do what you want.

Compare the following:

Eliezer Yudkowsky, research fellow of the Singularity Institute.

Education: -

Professional Experience: -

Awards and Honors: A lot of karma on lesswrong and many people like his Harry Potter fanfiction.

vs.

Eliezer Yudkowsky, chief of research at the Institute for AI Ethics.

Education: He holds three degrees from the Massachusetts Institute of Technology: a Ph.D. in mathematics, a BS in electrical engineering and computer science, and an MS in physics and computer science.

Professional Experience: He has worked on various projects with renowned people, making genuine contributions. He is the author of numerous studies and papers.

Awards and Honors: He holds various awards and is listed in the Who's Who in computer science.

Who are people going to listen to? Well, okay... the first Eliezer might receive a lot of karma on Less Wrong; the other doesn't have enough time for that.

Another problem is how you handle people who disagree with you and who you think are wrong. Concepts like "Well-Kept Gardens Die By Pacifism" will at some point explode in your face. I have chatted with a lot of people who left Less Wrong and who now portray Less Wrong/SI negatively, and their number is growing. Many won't even participate here because members are unwilling to talk to them charitably. That kind of behavior causes them to band together against you. Well-kept gardens die by pacifism; others are poisoned by negative karma. A much better rule would be to keep your friends close and your enemies closer.

Think about it. Imagine how easy it would have been for me to cause serious damage to SI and the idea of risks from AI by writing different kinds of emails.

Why does that RationalWiki entry about Less Wrong exist? You are just lucky that they are so far the only people who really care about Less Wrong/SI. What do you think will happen if you continue to act the way you do and real experts feel uncomfortable with your statements, or even threatened by them? It takes just one top-notch person who becomes seriously bothered to damage your reputation permanently.

Comment author: wedrifid 20 January 2012 11:20:36AM, 3 points

Concepts like "Well-Kept Gardens Die By Pacifism" will at some point explode in your face.

Counterprediction: The optimal degree of implementation of that policy for the purpose of PR maximisation is somewhat higher than it currently is.

You don't secure an ideal public image by being gentle.

Comment author: XiXiDu 20 January 2012 12:39:18PM, 5 points

You don't secure an ideal public image by being gentle.

Don't start a war you don't expect to be able to win. It is much easier to damage a reputation than to build one, especially if you support a cause that easily triggers the absurdity heuristic in third parties.

Being rude to people who don't get it will just cause them to reinforce their opinion and tell everyone that you are wrong instead. And that will work, because your arguments are complex and in support of something that sounds a lot like science fiction.

A better route is to just ignore them if you are not willing to discuss the matter, or else to explain exactly how they are wrong. And if you consider both routes undesirable, then do it like FHI and don't host a public forum.

Comment author: wedrifid 20 January 2012 01:07:24PM, 3 points

Being rude to people

Being gratuitously rude to people isn't the point. 'Maintaining a garden' for the purpose of optimal PR involves far more targeted and ruthless intervention. "Weeds" (those who are likely to try to sabotage your reputation, otherwise interfere with your goals, or significantly provoke 'rudeness' from others) are removed early, before they have a chance to take root.