Yvain comments on xkcd on the AI box experiment - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (229)
In his defense, is it possible EY can't win at this point, regardless of his approach? Maybe the internet has grabbed this thing and the PR whirlwinds are going to do with it whatever they like?
I've read apologies from EY where he seems to admit pretty clearly that he screwed up. He comes off as defensive and pissy sometimes in my opinion, but he seems sincerely irked about how RW and other outlets have twisted the whole story to discredit LW and himself. From my recall, one comment he made on the reddit sub dedicated to his HP fanfic indicated he was very hurt by the whole kerfuffle, in addition to his obvious frustration.
At this point I think the winning move is rolling with it and selling little plush basilisks as a MIRI fundraiser. It's our involuntary mascot, and we might as well 'reclaim' it in the social justice sense.
Then every time someone brings up "Less Wrong is terrified of the basilisk" we can just be like "Yes! Yes we are! Would you like to buy a plush one?" and everyone will appreciate our ability to laugh at ourselves, and they'll go back to whatever they were doing.
Hm. Turn your weakness into a plush toy then sell it to raise money and disarm your critics. Winning.
Excellent idea. I would buy that, especially if it has a really bizarre design.
I'd like merchandise-based tribal allegiance membership signalling items anyway. Anyone selling MIRI mugs or LessWrong T-shirts can expect money from me.
Blasphemy, our mascot is a paperclip.
I'd prefer a paperclip dispenser with something like "Paperclip Maximizer (version 0.1)" written on it.
But a plush paperclip would probably not hold its shape very well, and become a plush basilisk.
Close enough
I feel the need to switch from Nerd Mode to Dork Mode and ask:
Which would win in a fight, a basilisk or a paperclip maximizer?
Paperclip maximizer, obviously. Basilisks are typically static entities, and I'm not sure how you would go about making a credible anti-paperclip 'infohazard'.
That depends entirely on what the PM's code is. If it doesn't include input sanitizers, a buffer overflow attack could suffice as a basilisk. If your model of a PM basilisk is "Something that would constitute a logical argument that would harm a PM", then you're operating on a very limited understanding of basilisks.
The same way as an infohazard for any other intelligence: acausally threaten to destroy lots of paperclips, maybe even uncurl them, maybe even uncurl them while they were still holding a stack of pap-ARRRRGH I'LL DO WHATEVER YOU WANT JUST DON'T HURT THEM PLEASE
"selling little plush basilisks as a MIRI fundraiser."
By "selling", do you mean giving basilisks to people who give money? It seems like a more appropriate policy would be giving a plush basilisk to anyone who doesn't give money.
Sound like the first step a Plush Basilisk Maximizer would take... :-D
I'd buy this. We can always use more stuffies.
Yes, brilliant idea!
It should be a snake, only with little flashing LEDs in its eyes.
The canonical basilisk paralyzes you if you look at it. Flickering lights carry the danger of triggering photosensitive epilepsy, and thus are sort of real-life basilisks. Even if the epilepsy reference is lost on many, it's still clearly a giant snake thing with weird eyes, and importantly you can probably get them from somewhere without having to custom-make them.
(AFAIK little LEDs should be too small to actually pose a threat to epileptics, and they shouldn't be any worse than any of the other flickering lights around.)
EDIT: Eh, I suppose it could also be stuffed with paperclips or something, if we want to pack as many memes in as possible.
We can save money by re-coloring the plush Cthulhu. It's basically the same, right? :-)
Alternatively, sell empty boxes labelled "Don't look!"