
faul_sname comments on Irrationality Game II - Less Wrong Discussion

13 [deleted] 03 July 2012 06:50PM




Comment author: faul_sname 04 July 2012 09:11:27PM 13 points [-]

Upvoted for enormous overconfidence that a universal basilisk exists.

Comment author: Armok_GoB 05 July 2012 12:36:48AM 0 points [-]

Never said it was a single universal one. And a lot of that 2% is meta-uncertainty from doing the math sloppily.

The part where I think I might do better is having been on the receiving end of weaker basilisks and having some vague idea of how to construct something like it. That last part is the tricky one stopping me from sharing the evidence, since it'd make it more likely that a weapon like that falls into the wrong hands.

Comment author: faul_sname 05 July 2012 03:02:39AM 5 points [-]

The thing about basilisks is that they have limited capacity for causing actual death, particularly among average people, who get their cues about whether something is worrying from the social context (e.g. authority figures or their social group).

Comment author: Armok_GoB 05 July 2012 01:52:48PM 1 point [-]

Must... resist... revealing... info... that... may... get... people... killed.

Comment author: faul_sname 05 July 2012 02:49:22PM 3 points [-]

Please do resist. If you must tell someone, do it through private message.

Comment author: Armok_GoB 05 July 2012 07:17:13PM 1 point [-]

Yeah. It's not THAT big a danger; I'm just trying to make it clear why I hold a belief not based on evidence that I can share.

Comment author: Davorak 09 July 2012 09:04:17PM *  3 points [-]

Even speculating that your evidence is a written work that has driven multiple people to suicide, and further that the work was targeted at an individual and happened to kill other susceptible people who read it, I would still rate 2% as overconfident.

Specifically, the claim of universality, that "any person" can be killed by reading a short email, is overconfident. Two of your claims seem to contradict each other: the claims of "any one" and "with a few clicks" suggest that special or in-depth knowledge of the individual is unnecessary, which implies some level of universality; but you also said, "Never said it was a single universal one." My impression is that you lean towards hand-crafted basilisks targeted at individuals or groups of similar individuals, but the contradiction lowered my estimate that this reading is correct.

Such hand-crafted basilisks would require the ability to correctly model people to an exceptional degree and to experiment with that model until an input can be found which causes death. I have considered other alternative explanations but found them unlikely; if you rate another as more realistic, let me know.

This ability could be used for a considerable number of tasks other than causing death: strongly influencing elections, legislation, the research directions of AI researchers or groups, and much more. If EY possessed this power, how would you expect the world to differ from one where he does not?

Comment author: Armok_GoB 29 July 2012 07:57:11PM 1 point [-]

I don't remember this post. Weird. I've updated on it, though; my evidence is indeed even weaker than that, and you are absolutely correct on every point. I've updated to the point where my own estimate and my estimate of the community's estimate are indistinguishable.

Comment author: Davorak 31 July 2012 07:24:34PM *  1 point [-]

Interesting. I will be more likely to reply to messages that feel like they end the conversation, like your last one on this post:

It feels like this one caused me to update far more in the direction of basilisks being unlikely than anything else in this thread, although I don't know exactly how much.

maybe 12-24 hours later, just in case the likelihood of updating has been reduced by one or both parties having had a late-night conversation or other mind-altering effects.

Comment author: Armok_GoB 01 August 2012 03:56:03PM 0 points [-]

Good idea, please do that.

Comment author: Armok_GoB 09 July 2012 11:27:08PM 1 point [-]

It feels like this one caused me to update far more in the direction of basilisks being unlikely than anything else in this thread, although I don't know exactly how much.