Yvain comments on xkcd on the AI box experiment - Less Wrong Discussion

15 Post author: FiftyTwo 21 November 2014 08:26AM

Comment author: Yvain 22 November 2014 02:39:54AM *  51 points [-]

At this point I think the winning move is rolling with it and selling little plush basilisks as a MIRI fundraiser. It's our involuntary mascot, and we might as well 'reclaim' it in the social justice sense.

Then every time someone brings up "Less Wrong is terrified of the basilisk" we can just be like "Yes! Yes we are! Would you like to buy a plush one?" and everyone will appreciate our ability to laugh at ourselves, and they'll go back to whatever they were doing.

Comment author: ThisSpaceAvailable 26 November 2014 09:03:35AM *  3 points [-]

"selling little plush basilisks as a MIRI fundraiser."

By "selling", do you mean giving basilisks to people who give money? It seems like a more appropriate policy would be giving a plush basilisk to anyone who doesn't give money.

Comment author: Lumifer 26 November 2014 04:06:48PM 3 points [-]

It seems like a more appropriate policy would be giving a plush basilisk to anyone who doesn't give money.

Sounds like the first step a Plush Basilisk Maximizer would take... :-D

Comment author: Ishaan 23 November 2014 10:57:47AM *  0 points [-]

It should be a snake, only with little flashing LEDs in its eyes.

The canonical basilisk paralyzes you if you look at it. Flickering lights carry the danger of triggering photosensitive epilepsy, and thus are sort of real-life basilisks. Even if the epilepsy reference is lost on many, it's still clearly a giant snake thing with weird eyes, and importantly, you can probably get one from somewhere without having to custom-make it.

(AFAIK, little LEDs should be too small to actually pose a threat to epileptics, and they shouldn't be any worse than any of the other flickering lights around.)

EDIT: Eh, I suppose it could also be stuffed with paperclips or something, if we want to pack as many memes in as possible.

Comment author: TheMajor 23 November 2014 09:15:17AM 0 points [-]

Yes, brilliant idea!

Comment author: Error 22 November 2014 10:05:28PM 1 point [-]

I'd buy this. We can always use more stuffies.

Comment author: chaosmage 22 November 2014 11:08:24AM 7 points [-]

Excellent idea. I would buy that, especially if it has a really bizarre design.

I'd like merchandise-based tribal allegiance membership signalling items anyway. Anyone selling MIRI mugs or LessWrong T-shirts can expect money from me.

Comment author: Locaha 22 November 2014 11:03:27AM -1 points [-]

We can save money by re-coloring the plush Cthulhu. It's basically the same, right? :-)

Comment author: FiftyTwo 23 November 2014 12:09:23PM 4 points [-]

Alternatively, sell empty boxes labelled "Don't look!"

Comment author: Tenoke 22 November 2014 10:41:35AM 9 points [-]

Blasphemy, our mascot is a paperclip.

Comment author: Error 22 November 2014 10:06:08PM 2 points [-]

I feel the need to switch from Nerd Mode to Dork Mode and ask:

Which would win in a fight, a basilisk or a paperclip maximizer?

Comment author: Dallas 22 November 2014 11:21:38PM 0 points [-]

Paperclip maximizer, obviously. Basilisks typically are static entities, and I'm not sure how you would go about making a credible anti-paperclip 'infohazard'.

Comment author: ThisSpaceAvailable 26 November 2014 09:00:20AM 3 points [-]

That depends entirely on what the PM's code is. If it doesn't include input sanitizers, a buffer overflow attack could suffice as a basilisk. If your model of a PM basilisk is "Something that would constitute a logical argument that would harm a PM", then you're operating on a very limited understanding of basilisks.

Comment author: lmm 25 November 2014 10:52:57PM 3 points [-]

The same way as an infohazard for any other intelligence: acausally threaten to destroy lots of paperclips, maybe even uncurl them, maybe even uncurl them while they were still holding a stack of pap-ARRRRGH I'LL DO WHATEVER YOU WANT JUST DON'T HURT THEM PLEASE

Comment author: philh 22 November 2014 03:15:38PM 4 points [-]

But a plush paperclip would probably not hold its shape very well, and would become a plush basilisk.

Comment author: Tenoke 22 November 2014 05:38:15PM 26 points [-]

Close enough

Comment author: chaosmage 22 November 2014 11:11:24AM 11 points [-]

I'd prefer a paperclip dispenser with something like "Paperclip Maximizer (version 0.1)" written on it.

Comment author: Brillyant 22 November 2014 02:51:13AM 8 points [-]

Hm. Turn your weakness into a plush toy then sell it to raise money and disarm your critics. Winning.