XiXiDu comments on Existential Risk and Public Relations - Less Wrong

36 Post author: multifoliaterose 15 August 2010 07:16AM


Comment author: michaelkeenan 15 August 2010 09:30:50AM 7 points [-]

During graduate school I've met many smart people who I wish would take existential risk more seriously. Most such people who have heard of Eliezer do not find his claims credible. My understanding is that the reason for this is that Eliezer has made some claims which they perceive to be falling under the above rubric, and the strength of their negative reaction to these has tarnished their mental image of all of Eliezer's claims.

Can you tell us more about how you've seen people react to Yudkowsky? That these negative reactions are significant is crucial to your proposal, but I have rarely seen negative reactions to Yudkowsky (and never in person) so my first availability-heuristic-naive reaction is to think it isn't a problem. But I realize my experience may be atypical and there could be an abundance of avoidable Yudkowsky-hatred where I'm not looking, so would like to know more about that.

Did that objectionable Yudkowsky-meteorite comment get widely disseminated? YouTube says the video has only 500 views, and I imagine most of those are from Yudkowsky-sympathizing Less Wrong readers.

Comment author: XiXiDu 15 August 2010 01:32:06PM *  16 points [-]

Negative reactions to Yudkowsky from various people (academics concerned with x-risk), just within the past few weeks:

I also have an extreme distaste for Eliezer Yudkowsky, and so I have a hard time forcing myself to cooperate with any organization that he is included in, but that is a personal matter.

You know, maybe I'm not all that interested in any sort of relationship with SIAI after all if this, and Yudkowsky, are the best you have to offer.

...

There are certainly many reasons to doubt the belief system of a cult based around the haphazard musings of a high school dropout, who has never written a single computer program but professes to be an expert on AI. As you point out none of the real AI experts are crying chicken little, and only a handful of AI researchers, cognitive scientists or philosophers take the FAI idea seriously.

...

Wow, that's an incredibly arrogant put-down by Eliezer... SIAI won't win many friends if he puts things like that...

...

...he seems to have lost his mind and written out of strong feelings. I disagree with him on most of these matters.

...

Questions of priority - and the relative intensity of suffering between members of different species - need to be distinguished from the question of whether other sentient beings have moral status at all. I guess that was what shocked me about Eliezer's bald assertion that frogs have no moral status. After all, humans may be less sentient than frogs compared to our posthuman successors. So it's unsettling to think that posthumans might give simple-minded humans the same level of moral consideration that Eliezer accords frogs.

I was told that the quotes above state some ad hominem falsehoods regarding Eliezer. I think it is appropriate to edit the message to show that some of these people may not have been honest, or clueful. Otherwise I'll unnecessarily end up perpetuating possible ad hominem attacks.

Comment author: Eliezer_Yudkowsky 18 August 2010 03:06:41PM 15 points [-]

who has never written a single computer program

utterly false, wrote my first one at age 5 or 6, in BASIC on a ZX-81 with 4K of RAM

The fact that a lot of these reactions are based on false info is worth noting. It doesn't defeat any arguments directly, but it shows that the naive model - in which everything happens because of the direct perception of actions I directly control - is false.

Comment author: timtyler 18 August 2010 03:13:06PM *  0 points [-]

That sounds like a pretty rare device! Most ZX81 models had either 1K or 16K of RAM. 32 KB and 64 KB expansion packs were eventually released too.

Comment author: NancyLebovitz 15 August 2010 10:27:03PM 8 points [-]

Is it likely that someone who's doing interesting work that's publicly available wouldn't attract some hostility?

Comment author: Jonathan_Graehl 16 August 2010 10:10:25PM 3 points [-]

I guess that was what shocked me about Eliezer's bald assertion that frogs have no moral status.

This seems a rather minor objection.

Comment author: Emile 18 August 2010 03:26:20PM 6 points [-]

But frogs are CUTE!

And existential risks are boring, and only interest Sci-Fi nerds.

Comment author: Vladimir_Nesov 15 August 2010 01:43:23PM 6 points [-]

That N negative reactions about issue S exist only means that issue S is sufficiently popular.

Comment author: CarlShulman 15 August 2010 01:53:58PM 5 points [-]

Not if the polling is of folk in a position to have had contact with S, or is representative.

Comment author: Vladimir_Nesov 15 August 2010 02:03:34PM 3 points [-]

Sure, but XiXiDu's quotes bear no such framing.

Comment author: XiXiDu 15 August 2010 01:55:44PM 5 points [-]

I don't like to, but if necessary I can provide the identity of the people who stated the above. They all work directly to reduce x-risks. I won't do so in public, however.

Comment author: Vladimir_Nesov 15 August 2010 02:05:02PM *  4 points [-]

The identity of these people is not the issue. The percentage of people in a given category who have negative reactions for a given reason, negative reactions for other reasons, or positive reactions would be useful - but not a bunch of soldier-arguments filtered in some unknown way.

Comment author: XiXiDu 15 August 2010 02:12:21PM *  6 points [-]

I know. However, I just wanted to highlight that there are negative reactions, including some not-so-negative critique. If you look further, you'll probably find more. I haven't saved everything I saw over the years; I just wanted to show that it's not as if nobody has a problem with EY. And on every occasion I actually defended him, by the way.

The context is also difficult to provide, as some of it is from private e-mails. The first one, though, is from here, and after thinking about it I can also provide the name, since he was telling this to Michael Anissimov anyway. It is from Sean Hays:

Sean A. Hays, PhD
Post-Doctoral Fellow, Center for Nanotechnology in Society at ASU
Research Associate, ASU-NAF-Slate Magazine "Future Tense" Initiative
Program Director, IEET Securing the Future Program

Comment author: Rain 18 August 2010 04:48:56PM *  4 points [-]

You have a 'nasty things people say about Eliezer' quotes file?

Comment author: timtyler 15 August 2010 02:04:55PM *  0 points [-]

The last one was from David Pearce.