
Armok_GoB comments on Irrationality Game II - Less Wrong Discussion

13 [deleted] 03 July 2012 06:50PM




Comment author: Armok_GoB 04 July 2012 07:17:44PM *  38 points [-]

IRRATIONALITY GAME

Eliezer Yudovsky has access to a basilisk kill agent that allows him to with a few clicks untraceably assassinate any person he can get to read a short email or equivalent, with comparable efficiency to what is shown in Deathnote.

Probability: improbable ( 2% )

Comment author: maia 06 July 2012 04:20:52PM 13 points [-]

This seems like a sarcastic Eliezer Yudkowsky Fact, not a serious Irrationality Game entry.

Comment author: faul_sname 04 July 2012 09:11:27PM 13 points [-]

Upvoted for enormous overconfidence that a universal basilisk exists.

Comment author: Armok_GoB 05 July 2012 12:36:48AM 0 points [-]

Never said it was a single universal one. And a lot of that 2% is meta-uncertainty from doing the math sloppily.

The part where I think I might do better is having been on the receiving end of weaker basilisks and having some vague idea of how to construct something like it. That last part is the tricky one stopping me from sharing the evidence as it'd make it more likely a weapon like that falls into the wrong hands.

Comment author: faul_sname 05 July 2012 03:02:39AM 5 points [-]

The thing about basilisks is that they have limited capacity for causing actual death. Particularly among average people who get their cues of whether something is worrying from the social context (e.g. authority figures or their social group).

Comment author: Armok_GoB 05 July 2012 01:52:48PM 1 point [-]

Must... resist... revealing... info.... that... may... get... people.... killed.

Comment author: faul_sname 05 July 2012 02:49:22PM 3 points [-]

Please do resist. If you must tell someone, do it through private message.

Comment author: Armok_GoB 05 July 2012 07:17:13PM 1 point [-]

Yea. It's not THAT big a danger; I'm just trying to make it clear why I hold a belief not based on evidence that I can share.

Comment author: Davorak 09 July 2012 09:04:17PM *  3 points [-]

Speculating that your evidence is a written work that has driven multiple people to suicide, and further that the written work was targeted at an individual and happened to kill other susceptible people who read it: I would still rate 2% as overconfident.

Specifically, the claim of universality, that "any person" can be killed by reading a short email, is overconfident. Two of your claims seem to contradict each other: "any person" and "with a few clicks" suggest that special or in-depth knowledge of the individual is unnecessary, which implies some level of universality, yet you also say "Never said it was a single universal one." My impression is that you lean towards hand-crafted basilisks targeted at individuals or groups of similar individuals, but the contradiction lowered my estimate of this being correct.

Such hand-crafted basilisks indicate the ability to model people correctly to an exceptional degree and to experiment with that model until an input can be found which causes death. I have considered other alternative explanations but found them unlikely; if you rate another as more realistic, let me know.

Given that this ability could be used for a considerable number of tasks other than causing death (strongly influencing elections, legislation, the research directions of AI researchers or groups, and much more), if EY possessed this power, how would you expect the world to differ from one where he does not?

Comment author: Armok_GoB 29 July 2012 07:57:11PM 1 point [-]

I don't remember this post. Weird. I've updated on it though; my evidence is indeed even weaker than that, and you are absolutely correct on every point. I've updated to the point where my own estimate and my estimate of the community's estimate are indistinguishable.

Comment author: Davorak 31 July 2012 07:24:34PM *  1 point [-]

Interesting. I will be more likely to reply to messages that I feel end the conversation, like your last one on this post:

It feels like this one caused me to update far more in the direction of basilisks being unlikely than anything else in this thread, although I don't know exactly how much.

maybe 12-24 hours later, just in case the likelihood of an update has been reduced by one or both parties having had a late-night conversation or other mind-altering effects.

Comment author: Armok_GoB 09 July 2012 11:27:08PM 1 point [-]

It feels like this one caused me to update far more in the direction of basilisks being unlikely than anything else in this thread, although I don't know exactly how much.

Comment author: Eliezer_Yudkowsky 09 July 2012 06:16:54PM 7 points [-]

This seems like a clear example of "You shouldn't adjust the probability that high just because you're trying to avoid overconfidence; that's privileging a complicated possibility."

Comment author: wedrifid 09 July 2012 09:45:48PM 2 points [-]

This seems like a clear example of "You shouldn't adjust the probability that high just because you're trying to avoid overconfidence; that's privileging a complicated possibility."

Has there been a post on this subject yet? Handling overconfidence in that sort of situation is complicated.

Comment author: Eliezer_Yudkowsky 09 July 2012 10:24:13PM 1 point [-]
Comment author: wedrifid 10 July 2012 12:42:24AM 1 point [-]

Thanks! I recall reading that one, but I hadn't remembered it.

It still leaves me with some doubt about how to handle uncertainty around the extremes without being pumpable or sometimes catastrophically wrong. I suppose some of that is inevitable given hardware that is both bounded and corrupted but I rather suspect there is some benefit to learning more. There's probably a book or ten out there I could read.

Comment author: [deleted] 31 July 2012 07:53:37PM 0 points [-]

Reading this comment made me slightly update my probability that the parent, or a weaker version thereof, is correct.

Comment author: Armok_GoB 09 July 2012 11:21:07PM *  0 points [-]

It may or may not be an example, but it's certainly not a clear one to me. Please explain? The entire sentence seems nonsensical: I know what the individual words mean, but not how to apply them to the situation. Is this just some psychological effect because it targets a statement I personally made? It certainly doesn't feel like it, but...

Edit: Figured out what I misunderstood. I was modelling it as .02 positive confidence, not .98 negative confidence.

Comment author: Psy-Kosh 10 July 2012 03:22:43AM 9 points [-]

2% is way way way WAY too high for something like that. You shouldn't be afraid to assign a probability much closer to 0.

Comment author: John_Maxwell_IV 06 July 2012 05:03:07AM 11 points [-]

If such a universal basilisk exists, wouldn't it almost by definition kill the person who discovered it?

I think it's vaguely plausible such a basilisk exists, but I also think you are suffering from the halo effect around EY. Why would he of all people know about the basilisk? He's just some blogger you read who says things as though they are Deep Wisdom so people will pay attention.

Comment author: Armok_GoB 06 July 2012 03:08:22PM 0 points [-]

There are a bunch of tricks that let you immunize yourself against classes of basilisks without having access to the specific basilisk: sort of like vaccination, where you deliberately infect yourself with a non-lethal variant first.

Eliezer has demonstrated all the skills needed to construct basilisks, is very smart, and has shown that he recognizes the danger of basilisks. I don't think that's a very common combination, but conditional on Eliezer having basilisk weapons, most others fitting that description equally well probably do as well.

Comment author: FiftyTwo 06 July 2012 09:04:00PM 8 points [-]

Wouldn't the world be observably different if everyone of EY's intellectual ability or above had access to a basilisk kill agent? And wouldn't we expect a rash of inexplicable deaths in people who are capable of constructing a basilisk but not vaccinating themselves?

Comment author: Eliezer_Yudkowsky 09 July 2012 06:19:15PM 9 points [-]

Not necessarily. If I did, in fact, possess such a basilisk, I cannot think offhand of any occasion where I would have actually used it. Robert Mugabe doesn't read my emails, it's not clear that killing him saves Zimbabwe, I have ethical inhibitions that I consider to exist for good reasons, and have you thought about what happens if somebody else glances at the computer screen afterward, and resulting events lead to many agents/groups possessing a basilisk?

Comment author: wedrifid 09 July 2012 09:34:35PM *  4 points [-]

and have you thought about what happens if somebody else glances at the computer screen afterward, and resulting events lead to many agents/groups possessing a basilisk?

It would guarantee drastic improvements in secure, trusted communication protocols and completely cure internet addiction (among the comparatively few survivors).

Comment author: TheOtherDave 06 July 2012 09:24:27PM 16 points [-]

Are basilisks necessarily fatal? If the majority of basilisks caused insanity or the loss of intellectual capacity instead of death, I would expect to see a large group of people who considered themselves capable of constructing basilisks, but who on inspection turned out to be crazy or not nearly that bright after all.

...

Oh, shit.

Comment author: FiftyTwo 06 July 2012 10:02:56PM 1 point [-]

Are basilisks necessarily fatal?

The post specified fatal so I followed it.

For non-fatal basilisks we'd expect to see people flipping suddenly from highly intelligent and sane to stupid and/or crazy, specifically after researching basilisk-related topics.

Comment author: Armok_GoB 06 July 2012 10:26:16PM 2 points [-]

Yes, and this can also be reversed: it's a good way to see which topics are related to practical basilisk construction.

Comment author: [deleted] 07 July 2012 07:59:38AM *  0 points [-]

Yes, but you would get false positives too, such as chess (scroll down to "Real Life" -- warning: TVTropes). Edited to fix link syntax -- how come after all these months I still get it wrong this often?

Comment author: Armok_GoB 06 July 2012 10:23:43PM 0 points [-]

Yup, this is entirely correct; I learned that the hard way. Vastly so: weak basilisks like that constantly arise from random noise in the memepool, while even knowing how, and having all the necessary ingredients, an Eliezer-class mind is likely needed for a lethal one.

Great practice for FAI, in a way, in that as soon as you make a single misstep you've lost everything forever and won't even know it. Don't try this at home.

Comment author: Armok_GoB 06 July 2012 10:29:32PM 0 points [-]

First off, there aren't nearly enough people for it to be any kind of "rash"; secondly, they must be researching a narrow range of topics where basilisks occur; thirdly, they'd go insane and lose the basilisk-creation capacity way before they got to deliberately lethal ones; and finally, anyone smart enough to be able to do that is smart enough not to do it.

Comment author: TheOtherDave 04 July 2012 07:58:00PM 7 points [-]

Upvoted for vast overconfidence.
Downvoted back to zero because I suspect you're not following the rules of the thread.
Also, I have no idea who "Eliezer Yudovsky" is, though it doesn't matter for either of the above.

Comment author: [deleted] 23 January 2013 05:17:41PM 0 points [-]

Well, this is scary enough.

Comment author: Armok_GoB 05 July 2012 02:16:17PM 0 points [-]

I am way too good at this game. :(

I really didn't expect this to go this high. All the other posts get lots of helpful comments about WHY they were wrong. If I'm really wrong, which these upvotes indicate, I really need to know WHY, so I know which connected beliefs to update as well.

Comment author: Jack 05 July 2012 07:52:16PM 12 points [-]

2% is too high a credence for belief in the existence of powers for which (as far as I know) not even anecdotal evidence exists. It's the realm of speculative fiction, well beyond the current ability of psychological and cognitive science and, one imagines, rather difficult to control.

But ascribing such a power to a specific individual who hasn't had any special connection to cutting edge brain science or DARPA and isn't even especially good at using conventional psychological weapons like 'charm' is what sends your entry into the realm of utter and astonishing absurdity.

Comment author: Will_Newsome 07 July 2012 04:42:15AM 4 points [-]

a specific individual who hasn't had any special connection to cutting edge brain science or DARPA

Not publicly, at least.

Comment author: Armok_GoB 06 July 2012 12:17:36AM 0 points [-]

2% is too high a credence for belief in the existence of powers for which (as far as I know) not even anecdotal evidence exists. It's the realm of speculative fiction, well beyond the current ability of psychological and cognitive science and, one imagines, rather difficult to control.

Many say exactly the same thing about cryonics. And lots of anecdotal evidence does exist: not of killing specifically, but of inducing a wide enough range of mental states that some within that range are known to be lethal.

So far, in my experience, skill at basilisks is utterly tangential to the skills you mentioned, and fits Eliezer's skill set extremely well. Further, he has demonstrated this type of ability before, for example in the AI-box experiments and HPMoR.

Comment author: Jack 06 July 2012 12:39:12AM 5 points [-]

Many say exactly the same thing about cryonics.

Pointing to cryonics anytime someone says you believe in something that is the realm of speculative fiction and well beyond current science is a really, really, bad strategy for having true beliefs. Consider the generality of your response.

And lots of anecdotal evidence does exist,

Show me three.

skill at basilisks

How is this even a thing? That you have experience with?

the AI box experiments

Your best point. But not nearly enough to bring p up to 0.02.

Comment author: Armok_GoB 06 July 2012 01:19:04AM 2 points [-]

Point; it's not a strategy for arriving at truths, it's a snappy comeback to a failure mode I'm getting really tired of. The fact that something is in the realm of speculative fiction is not a valid argument in a world full of cyborgs, tablet computers, self-driving cars, and causality-defying decision theories. And yes, basilisks.

Show me three.

Um, we're talking basilisks here. SHOWING you would be a bad idea. However, to NAME a few: there's the famous Roko incident, several MLP gorefics have had basilisk-like effects on some readers, and then there are techniques like http://www.youtube.com/watch?v=eNBBl6goECQ .

Yes, skill at basilisks is a thing that I have some experience with.

Finally, not in response to anything in particular but sort of related: http://cognitiveengineer.blogspot.se/2011/11/holy-shit.html

Comment author: Jack 06 July 2012 01:50:18AM 7 points [-]

Point; it's not a strategy for arriving at truths, it's a snappy comeback to a failure mode I'm getting really tired of. The fact that something is in the realm of speculative fiction is not a valid argument in a world full of cyborgs, tablet computers, self-driving cars, and causality-defying decision theories. And yes, basilisks.

The argument isn't that because something is found in speculative fiction it can't be real; it's that this thing you're talking about isn't found outside of speculative fiction-- i.e. it's not real. Science can't do that yet. If you're familiar with the state of a science you have a good sense of what is and isn't possible yet. "A basilisk kill agent that allows him to with a few clicks untraceably assassinate any person he can get to read a short email or equivalent, with comparable efficiency to what is shown in Deathnote" is very likely one of those things. I mention "speculative fiction" because a lot of people have a tendency to privilege hypotheses they find in such fiction.

Hypnotism is not the same as what you're talking about. The Roko "basilisk" is a joke compared to what you're describing. None of these are anecdotal evidence for the power you are describing.

Comment author: Armok_GoB 06 July 2012 03:28:48PM 0 points [-]

Oh, illusion of transparency. Yea, that's at least a real argument.

There are plenty of things that individual geniuses can do that the institutions you seem to be referring to as "science" can't yet mass-produce, especially in the reference class of things like works of fiction or political speeches, which many basilisks belong to. "Science" also believes rational agents defect in the prisoner's dilemma.

Also, while proposing something like deliberate, successful government suppression would clearly be falling into the conspiracy-theory failure mode, it nonetheless does seem that an extremely dangerous weapon that sounds absurd when described, works through badly understood psychology only present in humans, and is most likely to be discovered by the empathic extreme high elite of intellectuals would be less likely to become public knowledge as quickly as most things.

And I kept to small-scale, not-very-dangerous pseudo-basilisks on purpose, just in case someone decides to look them up. They are more relevant than you think, though.

Comment author: Jack 07 July 2012 02:36:49AM *  7 points [-]

And I kept to small-scale, not-very-dangerous pseudo-basilisks on purpose, just in case someone decides to look them up. They are more relevant than you think, though.

I don't believe you. Look, obviously if you have secret knowledge of the existence of fatal basilisks that you're unwilling to share that's a good reason to have a higher credence than me. But I asked you for evidence (not even good evidence, just anecdotal evidence) and you gave me hypnotism and the silly Roko thing. Hinting that you have some deep understanding of basilisks that I don't is explained far better by the hypothesis that you're trying to cover for the fact that you made an embarrassingly ridiculous claim than by your actually having such an understanding. It's okay, it was the irrationality game. You can admit you were privileging the hypothesis.

"Science" also believes rational agents defect on the prisoners dilemma.

Again, pointing to a failure of science as justification for ignoring it when evaluating the probability of a hypothesis is a really bad thing to do. You actually have to learn things about the world in order to manipulate the world. The most talented writers in the world are capable of producing profound and significant -- but nearly always temporary -- emotional reactions in the small set of people who connect with them. Equating that with

A basilisk kill agent that allows him to with a few clicks untraceably assassinate any person he can get to read a short email or equivalent, with comparable efficiency to what is shown in Deathnote

is bizarre.

Also, while proposing something like deliberate successful government suppression would be clearly falling into the conspiracy theory failure mode, it none the less does seem like an extremely dangerous weapon, that sounds absurd when described, works through badly understood psychology only present in humans, and appropriately likely to be discovered by empathic extreme high elite of intellectuals, would be less likely to become public knowledge as quickly as most things.

A government possessing a basilisk and keeping it a secret is several orders of magnitude more likely than what you proposed. Governments have the funds and the will to both test and create weapons that kill. Also, "empathic" doesn't seem like a word that describes Eliezer well.

Anyway, I don't really think this conversation is doing anyone any good, since debating absurd possibilities has the tendency to make them seem ever more likely over time, as you keep running your sense-making system and coming up with new and better justifications for the claim until you actually begin to think "wait, two percent seems kind of low!"

Comment author: Armok_GoB 07 July 2012 08:20:48PM 1 point [-]

Yea, this thread is getting WAY too adversarial for my taste, dangerously so. At least we can agree on that.

Anyway, you did admit that sometimes, rarely, a really good writer can produce permanent, profound emotional reactions, and I suspect most of the disagreement here actually resides in the lethality of emotional reactions, and in my taste for wording things to sound dramatic as long as they are still true.

Comment author: Multiheaded 13 August 2012 04:55:10AM *  6 points [-]

Yet another fictional story features a rather impressive "emotional basilisk" of sorts: enough both to drive people in-universe insane or suicidal AND to cause the reader (especially one prone to agonizing over morality, obsessive thoughts, etc.) potentially serious distress. I know I felt sickened and generally wrong for a few hours, and I've heard of people who took it worse.

SCP-231. I'm not linking directly to it, please consider carefully if you want to read it. Curiosity over something intellectually stimulating but dangerous is one thing, but this one is just emotional torment for torment's sake. If you've read SCP before (I mostly dislike their stuff), you might be guessing which one I'm talking about - so no need to re-read it, dude.

Comment author: MugaSofer 22 November 2012 01:44:26AM 5 points [-]

Really? That's had basilisk-like effects? I guess these things are subjective... torturing one girl to save humanity is treated like this vast and terrible thing, with the main risk being that one day they won't be able to bring themselves to continue -- but in other stories they regularly kill tons of people in horrible ways just to find out how something works. Honestly, I'm not sure why it's so popular; there are a bunch of SCPs that could solve it (although there could be some brilliant reason why they can't; we'll never know due to redaction). But it's too popular to ever be decommissioned... it makes the Foundation come across as lazy, not even trying to help the girl, too busy stewing in self-pity over the horrors they have to commit to actually stop committing them.

Wait, I'm still thinking about it after all this time? Hmm, perhaps there's something to this basilisk thing...

Comment author: [deleted] 16 August 2012 07:26:58PM *  4 points [-]

SCP-231. I'm not linking directly to it, please consider carefully if you want to read it. Curiosity over something intellectually stimulating but dangerous is one thing, but this one is just emotional torment for torment's sake. If you've read SCP before (I mostly dislike their stuff), you might be guessing which one I'm talking about - so no need to re-read it, dude.

The Straw Utilitarian exclaims: "Ha, easy! Our world has many tortured children; adding one more is a trivial cost to pay for continued human existence." But yes, imagining myself being put in a position to decide on something like that caused me quite a bit of emotional distress. Trying to work out what I should do according to my ethical system (spaghetti-code virtue ethics), honourable suicide and resignation seems a potentially viable option, since my consequentialism-infected brain cells yell at me for trying harebrained schemes to help the girl.

On a lighter note my favourite SCP.

The members of SCP-1845 are physiologically indistinct from normal animals of their species. However, the animals have been demonstrated to possess near-human intelligence, the ability to construct simple tools from objects in their habitat and introduced by the Foundation, and a system of government modeled on medieval European feudalism.

...

SCP-1845-1 is the "leader" of the colony and the only member of the group observed to be able to use the installed keyboard. SCP-1845-1 considers itself to be of royal heritage and identifies itself using the title "His Royal Highness, Eugenio the Second, by the Grace of God, King of the Forest, Lord of the Plains, Duke of the Grand Fir and the Undergrowth, Count of the Swamp, Margrave of ██ ███████, Warden of All the Streams and Rivers, and Lord Protector of the Cities of Man, Defender of the Faith." SCP-1845-1 identifies itself and its followers as Roman Catholics and appears to be extremely pious in its devotions - it has been observed on video praying over its meals and observing holidays and saintly feast days, and has been observed to order punishments against other members of the colony for perceived lack of piety.

...

SCP-1845-1 has asserted that it was not responsible for the "war" that led to its discovery and capture, and that it was retaliating against an uprising on the part of one of its "subjects", a Columbian black-tailed deer (Odocoileus hemionus columbianus) it identified as "Duke Baxter of the West Bay." SCP-1845-1 spoke vitriolically of said deer, describing it as "a most uncouth usurper, rogue, and Protestant" who it claimed had, "having accused them falsely of witchcraft, assassinated our Queen Consort, the Prince of █████ █████, and our other royal issue", and of turning a large portion of the nobility and peasantry against it. It insists that the deer is still at large and marshalling its forces against its nation, and that once it is released from captivity it will defeat it. No deer matching the description given by SCP-1845-1 is among the members of SCP-1845 or was found among those killed during the raid.

Ah, the entry is tragically incomplete!

The Catholic faith of the animals was not surprising, since through contact with SPC-3471, agent ███ █████ and other LessWrong Computational Theology division cell members have received proof of Catholicism's consistency under CEV, as well as indications that it represents a natural Schelling point of mammalian morality. First pausing to praise the sovereign's taste in books, the existence of Protestantism has led Dr. █████ █████ to speculate that SPC-4271 ("w-force") has become active in species besides Homo sapiens, violating the 2008 Trilateral Blogosphere Accords. He advises full military support to Eugenio the Second in stamping out the rebellion, and the termination of all animals currently under the rule of Duke Baxter of the West Bay.

"Kill them all. For SPC-3471 knows them that are His."

Adding:

"Nuke the site from orbit, its the only way to be sure."

Comment author: Multiheaded 17 August 2012 08:46:52AM 1 point [-]

Yep, suicide is probably what I'd do as well, personally, but the story itself is incoherent (as noted in the page discussion) and even without resorting to other SCPs there seem to be many, many alternatives to consider (at the very least they could have made the torture fully automated!). As I've said, it's constructed purely as horror/porn and not as an ethical dilemma.

BTW simply saying that "Catholicism" is consistent under something or other is quite meaningless, as "C." doesn't make for a very coherent system as seen through Papal policy and decisions of any period. Will would've had to point to a specific eminent theologian, like Aquinas, and then carefully choose where and how to expand - for now, Will isn't doing much with his "Catholicism" strictly speaking, just writing emotionally tinged bits of cosmogony and game theory.

Comment author: [deleted] 17 August 2012 09:45:45AM *  2 points [-]

Yep, suicide is probably what I'd do as well, personally, but the story itself is incoherent (as noted in the page discussion) and even without resorting to other SCPs there seem to be many, many alternatives to consider (at the very least they could have made the torture fully automated!). As I've said, it's constructed purely as horror/porn and not as an ethical dilemma.

I mentally iron-man such details when presented with such scenarios. Often it's the only way for me to keep suspension of disbelief and continue to enjoy fiction. To give a trivial fix to your nitpick: the ritual requires not only that the suffering of the victim be undiminished but also that the sexual pleasure of the torturer and/or rapist be present; automating it is therefore not viable.

BTW simply saying that "Catholicism" is consistent under something or other is quite meaningless, as "C." doesn't make for a very coherent system as seen through Papal policy and decisions of any period. Will would've had to point to a specific eminent theologian, like Aquinas, and then carefully choose where and how to expand - for now, Will isn't doing much with his "Catholicism" strictly speaking, just writing emotionally tinged bits of cosmogony and game theory.

Do not overanalyse the technobabble; it ruins suspension of disbelief. And what is an SCP without technobabble? Can I perhaps then interest you in a web-based Marxist state?

Also who is this Will? I deny all knowledge of him!

Comment author: Multiheaded 22 November 2012 01:31:02PM 0 points [-]

Trying to work out what I should do according to my ethical system (spaghetti code virtue ethics), honourable suicide and resignation seems a potentially viable option

An agent placed in similar circumstances before did just that.

Comment author: Multiheaded 18 July 2012 12:17:48PM 6 points [-]

I do not know with what weapons World War III will be fought, but World War IV will be fought with fairytales about talking ponies!

Comment author: Armok_GoB 18 July 2012 08:06:05PM 1 point [-]

I love you so much right now. :D

Comment author: MixedNuts 11 July 2012 10:37:52AM 5 points [-]

I have a solid basilisk-handling procedure. (Details available on demand.) You or anyone is welcome to send me any basilisk in the next 24 hours, or at any point in the future with 12 hours warning. I'll publish how many different basilisks I've received, how basilisky I found them, and nothing else.

Evidence: I wasn't particularly shaken by Roko's basilisk. I found Cupcakes a pretty funny read (thanks for the rec!). I have lots of experience blocking out obsessive/intrusive thoughts. I just watched 2girls1cup while eating. I'm good at keeping non-basilisk secrets.

Comment author: [deleted] 31 July 2012 08:11:57PM 0 points [-]

Has anyone sent you any basilisk so far?

Comment author: MixedNuts 01 August 2012 08:58:30AM 0 points [-]

No, I'm all basilisk-less and forlorn. :( I stumbled on a (probably very personal) weak basilisk on my own. Do people just not trust me, or do they not have any basilisks handy?

Comment author: Mitchell_Porter 01 August 2012 11:18:03AM 0 points [-]

How do you define basilisk? What effect is it supposed to have on you?

Comment author: wedrifid 01 August 2012 11:12:20AM 0 points [-]

Do people just not trust me or don't they have any basilisks handy?

The latter. Or, if the former, they don't trust you not to just laugh at what they provide and dismiss it.

Comment author: Dorikka 10 July 2012 02:15:39AM 2 points [-]

MLP gorefics

I am amused and curious. :P Did the basilisk-sharing list ever get off the ground?

Comment author: Armok_GoB 10 July 2012 02:35:21AM 0 points [-]

Not that I know of, and it's much less interesting than it sounds. Just nausea and a permanent inability to enjoy the show in a small percentage of readers of Cupcakes and the like.

Comment author: Multiheaded 20 July 2012 09:08:28PM 1 point [-]

Also, always related to any basilisk discussion:

The Funniest Joke In The World

Comment author: MugaSofer 22 November 2012 02:00:33AM 1 point [-]

I was about to condescendingly explain that there's simply no reason to posit such a thing, when it started making far too much sense for my liking. That said, untraceable? How?

Comment author: Armok_GoB 22 November 2012 05:21:55PM 1 point [-]

Email via proxy, some incubation time, looks like normal depression followed by suicide.

Comment author: MugaSofer 22 November 2012 07:19:05PM 1 point [-]

Of course. I was assuming a near-instant effect for some reason.

On the plus side, he doesn't seem to have used it to remove anyone blocking progress on FAI ...