timtyler comments on Existential Risk and Public Relations - Less Wrong

36 Post author: multifoliaterose 15 August 2010 07:16AM


Comment author: multifoliaterose 15 August 2010 10:53:19AM 13 points

Can you tell us more about how you've seen people react to Yudkowsky? That these negative reactions are significant is crucial to your proposal, but I have rarely seen negative reactions to Yudkowsky (and never in person) so my first availability-heuristic-naive reaction is to think it isn't a problem. But I realize my experience may be atypical and there could be an abundance of avoidable Yudkowsky-hatred where I'm not looking, so would like to know more about that.

I haven't seen examples of Yudkowsky-hatred. But I have regularly seen people ridicule him. Recalling Hanson's view that a lot of human behavior is really signaling and vying for status, I interpret this ridicule as functioning to lower Eliezer's status, to compensate for what people perceive as inappropriate status grubbing on his part.

Most of the smart people whom I know (including myself) perceive him as exhibiting a high degree of overconfidence in the validity of his views about the world.

This leads some of them to conceptualize him as a laughingstock, as somebody who's totally oblivious, and to feel that the idea that we should be thinking about artificial intelligence is equally worthy of ridicule. I personally am quite uncomfortable with these attitudes, agreeing with Holden Karnofsky's comment:

"I believe that there are enormous risks and upsides associated with artificial intelligence. Managing these deserves serious discussion, and it’s a shame that many laugh off such discussion."

I'm somewhat surprised that you appear not to have noticed this sort of thing independently. Maybe we hang out in rather different crowds.

Did that objectionable Yudkowsky-meteorite comment get widely disseminated? YouTube says the video has only 500 views, and I imagine most of those are from Yudkowsky-sympathizing Less Wrong readers.

Yes, I think that you're right. I just picked it out as a very concrete example of a statement that could provoke a substantial negative reaction. There are other qualitatively similar (but milder) things that Eliezer has said that have been more widely disseminated.

Comment author: timtyler 15 August 2010 11:25:51AM *  1 point

Recalling Hanson's view that a lot of human behavior is really signaling and vying for status

Existential risk reduction too! Charities are mostly used for signalling purposes - to display affiliations and interests. Those caught up in causes use them for social networking with like-minded individuals - to signal how much they care, to signal how much spare time and energy they have - and so on. The actual cause is usually not irrelevant - but it is not particularly central either. It doesn't make much sense to expect individuals to be actually attempting to SAVE THE WORLD! This is much more likely to be a signalling phenomenon, making use of a superstimulus for viral purposes.