CellBioGuy comments on [LINK] Another "LessWrongers are crazy" article - this time on Slate - Less Wrong

9 points · Post author: CronoDAS 18 July 2014 04:57AM


Comment author: CellBioGuy 18 July 2014 05:13:19AM *  31 points [-]

Is there anything we should do?

Laugh, as the entire concept (and especially the entire reaction to it by Eliezer and the people who take the 'memetic hazard' thing seriously) is and always has been laughable. It's certainly given my ab muscles a workout every now and then over the last three years... maybe with more people getting to see it and getting that exercise, it'll be a net good! God, the effort I had to go through to dig through comment threads and find that Google cache...

This is also such a delicious example of the Streisand effect...

Comment author: polymathwannabe 18 July 2014 03:17:27PM 3 points [-]

This is also such a delicious example of the Streisand effect...

I was going to say precisely that. In the end, banning Roko's post was pointless and ineffectual: anyone with internet access can learn about the "dangerous" idea. Furthermore, it's still being debated here, in a LessWrong thread. Are any of you having nightmares yet?

Comment author: Viliam_Bur 19 July 2014 08:57:10AM 6 points [-]

On the other hand, does not banning these debates give us fewer of them? It doesn't seem so. We've already had a dozen of them, and we are going to have more, and more, and more...

I can't know what happened in the parallel Everett branch where Eliezer didn't delete that comment... but I wouldn't be too surprised if the exposure of the basilisk was pretty much the same -- without the complaining about censorship, but perhaps with more of "this is what people on LW actually believe, here is the link to prove it".

I think this topic is debated mostly because it's the clever contrarian thing to do. You have a website dedicated to rationality and artificial intelligence where people claim to care about humanity? Then you get contrarian points for inventing clever scenarios of how using rationality will actually make things go horribly wrong. It's too much fun to resist. (Please note that this motivated inventing of clever horror scenarios is different from predicting actual risks. Finding actual risks is a useful thing to do. Inventing dangers that exist only because you invented and popularized them is not very useful.)

Comment author: Jiro 20 July 2014 03:35:41AM 2 points [-]

On the other hand, does not banning these debates contribute to having less of them?

The debates are not technically banned, but there are still strict limits on what we're allowed to say. We cannot, for instance, have an actual discussion about why the basilisk wouldn't work.

Furthermore, there are aspects other than the ban that make LW look bad. Just the fact that people fall for the basilisk makes LW look bad all by itself. You could argue that the people who fall for the basilisk are mentally unstable, but having too many mentally unstable people, or being too willing to restrict normal people for the sake of mentally unstable people, makes us look bad too. Ultimately, the problem is that "looking bad" happens because there are aspects of LW that people consider to be bad. It's not just a public-relations problem: the basilisk demonstrates a lack of rationality on LW, and the only way to fix the bad perception is to fix the lack of rationality.

Comment author: David_Gerard 20 July 2014 02:26:17PM *  2 points [-]

One of the problems is that the basilisk is very weird, but the prerequisites - which are mostly straight out of the Sequences - are also individually weird. So explaining the basilisk to people who haven't read the Sequences through a few times and haven't been reading LessWrong for years is ... a bit of work.

Comment author: Jiro 21 July 2014 07:58:53PM 0 points [-]

Presumably, you don't believe the basilisk would work.

If you don't believe the basilisk would work, then it really doesn't matter all that much that people don't understand the prerequisites. After all, even understanding the prerequisites won't change their opinion of whether the basilisk is correct. (I suppose that understanding the Sequences may change the degree of incorrectness--going from crazy and illogical to just normally illogical--but I've yet to see anyone argue this.)

Comment author: David_Gerard 22 July 2014 10:17:51AM 1 point [-]

Are you saying it's meaningless to tell someone about the prerequisites - which, as I note, are pretty much straight out of the Sequences - unless they think the basilisk would work?

Comment author: Jiro 22 July 2014 03:37:49PM 1 point [-]

It's not meaningless in general, but it's meaningless for the purpose of deciding that they shouldn't see the basilisk because they'd misunderstand it. They don't misunderstand it--they know that it's false, and if they read the sequences they'd still know that it's false.

As I pointed out, you could still argue that they'd misunderstand the degree to which the basilisk is false, but I've yet to see anyone argue that.

Comment author: roystgnr 18 July 2014 06:46:43PM 4 points [-]

I've said precisely this in the past, but now I'm starting to second-guess myself. Sure, if we're worried about basilisk-as-memetic-hazard then deletion was an obvious mistake... but are any of you having nightmares yet? I'm guessing "no", in which case we're left with basilisk-as-publicity-stunt, which might actually be beneficial.

I wouldn't personally have advocated recruiting rationalists via a tactic like "Get a bunch of places to report on how crazy you are, then anyone who doesn't believe everything they read will be pleasantly surprised", but I'm probably the last person to ask about publicity strategy. I also would have disapproved of "Write Harry Potter fan fiction and see who wants to dig through the footnotes", without benefit of hindsight.

Comment author: Will_Newsome 18 July 2014 10:41:15AM 19 points [-]

This is also such a delicious example of the Streisand effect...

<puts on Quirrell hat> Yes, Eliezer's Streisanding is almost suspiciously delicious. One begins to wonder if he is in thrall to... well, perhaps it is best not to speculate here, lest we feed the Adversary.

Comment author: Dahlen 06 August 2014 05:35:43AM 2 points [-]

<puts on Quirrell turban>

There.

Comment author: Kawoomba 18 July 2014 10:46:51AM 8 points [-]

Hanlon's Beard.

Comment author: Viliam_Bur 18 July 2014 08:31:26AM *  16 points [-]

I think this is also a delicious example of how easy it is to troll LessWrong readers. Do you want to have an LW article and a debate about you? Post an article about how LW is a cult or about Roko's basilisk. Success 100% guaranteed.

Think about the incentives this gives to people who make their money by displaying ads on their websites. The only way we could motivate them more would be to pay them directly for posting that shit.

Comment author: Toggle 18 July 2014 04:02:22PM 12 points [-]

This isn't a particularly noteworthy attribute of Less Wrong discussion; most groups below a certain size will find it interesting when a major media outlet talks about them. I'm sure that the excellent people over at http://www.clown-forum.com/ would be just as chatty if they got an article.

I suppose you could say that it gives journalists an incentive to write about those groups below that population threshold that are likely to generate interest among a broader readership. But that's just the trivial result in which we have invented the human interest story.