Alicorn comments on Should LW have a public censorship policy? - Less Wrong

16 Post author: Bongo 11 December 2010 10:45PM




Comment author: Alicorn 12 December 2010 04:20:17AM *  16 points

Outside view indicates that if you stare at the basilisk, you will most likely either (a) think it was a terrible idea and wish you hadn't and maybe have nightmares, or (b) wonder what the heck all the fuss is about and consider it a waste of time except insofar as you might consider censorship wrong in itself, and might thereby be tempted to share the basilisk with others, each of whom has an independent risk of suffering reaction (a).

Do you want to want to stare at the basilisk?

Comment author: JoshuaZ 12 December 2010 09:57:58PM 9 points

You seem to be talking mainly in part (a) about the pseudo-basilisk rather than the basilisk itself. I suspect that most people who are vulnerable to the pseudo-basilisk are either mentally ill or vulnerable to having similar issues simply when thinking about the long-term implications of the second law of thermodynamics or the like. If one is strongly vulnerable to that sort of disturbing idea, then between the known laws of physics and the nasty low-probability claims made by some major religions, most basilisking of this sort is already well covered.

Comment author: Kingreaper 12 December 2010 04:25:08PM *  22 points

I'll add my opinion to the list:

I'm not an (a) or a (b).

Turns out, the basilisk was very close to one of the things I'd thought up, based on the nature of this community's elders, and dismissed with "no, no, they wouldn't buy into that idea, would they? No one here would fall for that...".

Reading about it, combined with the knowledge that EY banned it, gives me an insight into EY's thought patterns that significantly decreases my respect for him. I think that insight was worth the effort involved in reading it.

Comment author: Broggly 14 December 2010 05:46:00PM *  5 points

Honestly, I was surprised at EY's reaction. I thought he had figured out problems like that and would tear it to pieces rather than react as he did. Possibly I'm not as smart as he is, but even presuming Roko's right, you would think Rationalists Should Win. Plus, I think Eliezer has publicly published something similar to the Basilisk, albeit much weaker and without being explicitly basilisk-like, so I'd have thought he would have worked out a solution. (EDIT: No, it turns out it was someone else who came up with it. It wasn't really fleshed out, so Eliezer may not have thought much of it, or may never have noticed it in the first place.)

The fact that people are upset by it could be reason to hide it away, though, to protect the sensitive. Plus, having seen Dogma, I get that the post could be an existential risk...

Comment author: Kingreaper 14 December 2010 06:35:40PM 14 points

The fact that people are upset by it could be reason to hide it away, though, to protect the sensitive.

I don't think hiding it will prevent people getting upset. In fact, hiding it may make people more likely to believe it, and thus get scared. If someone respects EY and EY says "this thing you've seen is a basilisk", then they're more likely to be scared than if EY says "this thing you've seen is nonsense".

Comment author: Vaniver 14 December 2010 06:46:31PM 8 points

Plus, having seen Dogma, I get that the post could be an existential risk...

My understanding is that the post isn't the x-risk: a UFAI could think this up itself. The reaction to the post is supposedly an x-risk: if we let on we can be manipulated that way, then a UFAI can do extra harm.

But if you want to show that you won't be manipulated a certain way, it seems that the right way to do that is to tear that approach apart and demonstrate its silliness, not seek to erase it from the internet. I can't come up with a metric by which EY's approach is reasonable.

Comment author: wedrifid 14 December 2010 08:41:19PM *  3 points

My understanding is that the post isn't the x-risk: a UFAI could think this up itself. The reaction to the post is supposedly an x-risk: if we let on we can be manipulated that way, then a UFAI can do extra harm.

(Concerns not necessarily limited to either existential or UFAI, but we cannot discuss that here.)

But if you want to show that you won't be manipulated a certain way, it seems that the right way to do that is to tear that approach apart and demonstrate its silliness, not seek to erase it from the internet. I can't come up with a metric by which EY's approach is reasonable.

Agree. :)

Comment author: Broggly 14 December 2010 07:58:31PM 1 point

The reaction to the post is supposedly an x-risk

Yes, but not in the way you seem to be saying. I was semi-joking here, in that the post could spook people enough to increase x-risks (which wfg seems to be trying to do, albeit as blackmail rather than for its own sake). I was referring to how in the film Dogma gjb snyyra natryf, gb nibvq uryy, nggrzcg gb qrfgebl nyy ernyvgl. (rot13'd for spoilers, and in case it's too suggestive of the Basilisk)

if we let on we can be manipulated that way, then a UFAI can do extra harm.

It can? I suppose I just don't get decision theory. The non-basilisk part of that post left me pretty much baffled.

Comment author: topynate 12 December 2010 08:20:55PM 11 points

"Do you want to know?" whispered the guide; a whisper nearly as loud as an ordinary voice, but not revealing the slightest hint of gender.

Brennan paused. The answer to the question seemed suspiciously, indeed extraordinarily obvious, even for ritual.

"Yes, provided that * * * ** * * * * * * * * ** * * ** * * * * ** * * * * **," Brennan said finally.

"Who told you to say that?", hissed the guide.

Comment author: FormallyknownasRoko 12 December 2010 08:44:26PM *  6 points

Brennan is a fucking retard. No, you don't want to know. You want to signal affiliation with desirable groups, to send hard-to-fake signals of desirable personality traits such as loyalty, intelligence, power, and the presence of informed allies. You want to say everything bad you possibly can about the outgroup and everything good about the ingroup. You want to preach altruism and then make a plausible but unlikely reasoning error which conveniently stops you from having to give away anything costly.

All the other humans do all of these things. This is the true way of our kind. You will be punished if you deviate from the way, or even if you try to overtly mention that this is the way.

Comment author: katydee 17 December 2010 03:46:30AM *  8 points

This may be the way now, but it doesn't have to be the way always.

Comment author: NihilCredo 12 December 2010 04:00:18PM *  3 points

Minus the censorship part, that's not worse than watching Saw.

Comment author: Alicorn 12 December 2010 07:40:27PM 2 points

One can receive partial-impact synopses of Saw without risking the full effect, and gauge one's susceptibility with more information on hand.

Comment author: David_Gerard 12 December 2010 07:58:30PM 7 points

There's a reason I've refrained from seeking out 2 Girls 1 Cup.

(I should stop bringing it into my mind, really.)

Comment author: NihilCredo 12 December 2010 08:01:03PM 2 points

True. I think that after reading the debate(s) about the censored post one should have a pretty good idea of what it is, though.