Sysice comments on xkcd on the AI box experiment - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (229)
It might be useful to feature a page containing what we, you know, actually think about the basilisk idea. Although the rationalwiki page seems to be pretty solidly on top of google search, we might catch a couple people looking for the source.
If any XKCD readers are here: Welcome! I assume you've already googled what "Roko's Basilisk" is. For a better idea of what's going on with this idea, see Eliezer's comment on the xkcd thread (linked in Emile's comment), or his earlier response here.
Because of Eliezer's reaction, probably a hundred more people have heard of the Basilisk, and it tars LW's reputation.
And this wasn't particularly unforeseeable - see the Streisand Effect.
Part of rationality is about regarding one's actions as instrumental.
He mucked that one up. But to be fair to him, it's because he takes these ideas very seriously. I don't care about the basilisk because I don't take elaborate TDT-based reasoning too seriously, partially out of ironic detachment, but many here would say I should.
Righto - you should avoid dismissing things out of ironic detachment.
That explanation by Eliezer cleared things up for me. He really should have explained himself earlier. I actually had some vague understanding of what Eliezer was doing with his deletion and refusal to discuss the topic, but as usual, Eliezer's explanation makes things that I thought I sort-of-knew seem obvious in retrospect.
And as Eliezer realizes, the attempt to hush things up was a mistake. Roko's post should have been taken as a teaching moment.
Exactly. Having the official position buried in comments with long chains of references doesn't help to sound convincing compared to a well-formatted (even if misleading) article.
For a better idea of what's going on you should read all of his comments on the topic in chronological order.
I'm guessing Eliezer has one of those, probably locked away behind a triply-locked vault in the basement of MIRI.
See, it's comments like these that are one of the reasons people think LW is a cult.
Does MIRI actually have a basement?
It's behind the hidden door. Full of boxes which say "AI inside -- DO NOT TALK TO IT".
The ghosts there are not really dangerous. Usually.
When I visited MIRI's headquarters, they were trying to set up a video link to the Future of Humanity Institute. Somebody had put up a monitor in a prominent place and there was a sticky note saying something like "Connects to FHI - do not touch".
Except that the H was kind of sloppy and bent upward so it looked like an A.
I was really careful not to touch that monitor.
That response in /r/futurology is really good actually, I hadn't seen it before. Maybe it should be reposted (with the sarcasm slightly toned down) as a main article here?
Also, kudos to Eliezer for admitting he messed up with the original deletion.