Jiro comments on xkcd on the AI box experiment - Less Wrong Discussion

15 Post author: FiftyTwo 21 November 2014 08:26AM

Comment author: Jiro 21 November 2014 10:54:41PM *  4 points [-]

It's hard to polish a turd. And I think all the people who have responded by saying that Eliezer's PR needs to be better are suggesting that he polish a turd. The basilisk and the way the basilisk was treated has implications about LW that are inherently negative, to the point where no amount of PR can fix it. The only way to fix it is for LW to treat the Basilisk differently.

I think that if Eliezer were to

  1. Allow free discussion of the basilisk and
  2. Deny that the basilisk or anything like it could actually put one in danger from advanced future intelligences,

people would stop seeing the basilisk as reflecting badly on LW. It might take some time to fade, but it would eventually go away. But Eliezer can't do that, because he does think that basilisk-like ideas can be dangerous, and this belief of his is feeding his inability to really deny the Basilisk.

Comment author: JoshuaFox 22 November 2014 09:12:49PM 3 points [-]

And (3) explain why other potential info hazards, not the basilisk but very different configurations of acausal negotiation (ones that have either not yet been discovered, or were discovered but not made public), should not be discussed.

Comment author: RichardKennaway 23 November 2014 05:43:22PM 1 point [-]

But Eliezer can't do that, because he does think that basilisk-like ideas can be dangerous, and this belief of his is feeding his inability to really deny the Basilisk.

In other words, he disagrees with you and that is preventing him from agreeing with you.

Comment author: Jiro 24 November 2014 01:35:34AM 2 points [-]

Yes, except that agreeing with me is what a lot of people take Eliezer to be saying. There's a widespread belief that Eliezer simply denied the Basilisk. That's not really true; he denied the exact version of the Basilisk that was causing trouble, but he accepts the Basilisk in principle.

Comment author: TobyBartels 23 November 2014 11:45:58PM 0 points [-]

The basilisk and the way the basilisk was treated has implications about LW that are inherently negative, to the point where no amount of PR can fix it.

This is true; nevertheless, good PR should still make things as least bad as possible. And indeed, you go on to make a suggestion for how to do that (not even a bad one, in my opinion).

Comment author: AlexMennen 22 November 2014 12:12:39AM 0 points [-]

Eliezer has done (2) many times.

Comment author: Jiro 23 November 2014 03:15:19AM *  6 points [-]

Eliezer has denied that the exact Basilisk scenario is a danger, but not that anything like it can be a danger. He seems to think that discussing acausal trade with future AIs can be dangerous enough that we shouldn't talk about the details.

Comment author: V_V 22 November 2014 01:28:10AM 10 points [-]

Doing 2 without doing 1 looks insincere.

Comment author: Eliezer_Yudkowsky 23 November 2014 02:05:57AM 8 points [-]

This post is still here, isn't it?

Comment author: ChristianKl 23 November 2014 03:09:39PM 7 points [-]

If I remember correctly, a few posts did disappear earlier this year.

I'm also not aware of any explicit withdrawal of the previous policy.

Comment author: TobyBartels 23 November 2014 11:42:50PM 3 points [-]

We conclude that free discussion is now allowed, so maybe all that's really missing is putting that up explicitly somewhere that can be linked to?

Comment author: Eliezer_Yudkowsky 24 November 2014 02:04:12AM 0 points [-]

Not especially. This post is still here because I'm feeling too lethargic to delete it, but the /r/xkcd moderator deleted most of the basilisk discussion on their recent thread because it violated their Rule 3, "Be Nice". This is a fine upstanding policy, and I fully agree with it. If there's one thing we can deduce about the motives of future superintelligences, it's that they simulate people who talk about Roko's Basilisk and condemn them to an eternity of forum posts about Roko's Basilisk. So far as official policy goes, go talk about it somewhere else. But in this special case I won't ban any RB discussion such that /r/xkcd would allow it to occur there. Sounds fair to me.

Comment author: shminux 23 November 2014 02:49:28AM 2 points [-]

Are you implying that the basilisk discussion is somehow censored on this forum?

Comment author: V_V 23 November 2014 12:03:30PM *  5 points [-]

It doesn't appear to be censored in this thread, but it was historically censored on LessWrong. Maybe EY finally understood the Streisand effect.

Comment author: Rukifellth 23 November 2014 04:03:32PM -1 points [-]

He might do it less for the "danger" and more for "bad discussion". The threads I see on /sci/ raising questions about high IQ come to mind.

Well, most threads I see on /sci/ come to mind.

Comment author: V_V 23 November 2014 06:01:54PM 3 points [-]

I don't read /sci/, so I don't understand what you mean.

Comment author: Rukifellth 25 November 2014 02:09:31AM 0 points [-]

Do you know of it?

Comment author: V_V 25 November 2014 02:01:09PM 1 point [-]

No, I've just found out that it is a board on 4chan.

Comment author: somnicule 26 November 2014 08:13:00AM 0 points [-]

Typical low-moderation problems. Repeated discussions of contentious but played-out issues like religion, IQ, status of various fields, etc. The basilisk is an infohazard in that sense at this point, IMO. It's fun to argue about, to the point of displacing other worthwhile discussion.