
khafra comments on [LINK] Terrorists target AI researchers - Less Wrong Discussion

24 points · Post author: RobertLumley 15 September 2011 02:22PM




Comment author: khafra 15 September 2011 05:02:27PM · 5 points

Somebody mentioned Aleister Crowley's quotes on LW a little while ago; so:

There seems to be much misunderstanding about True Will ... The fact of a person being a gentleman is as much an ineluctable factor as any possible spiritual experience; in fact, it is possible, even probable, that a man may be misled by the enthusiasm of an illumination, and if he should find apparent conflict between his spiritual duty and his duty to honour, it is almost sure evidence that a trap is being laid for him and he should unhesitatingly stick to the course which ordinary decency indicates ... I wish to say definitely, once and for all, that people who do not understand and accept this position have utterly failed to grasp the fundamental principles of the Law of Thelema.

-- Magical Diaries of Aleister Crowley: Tunisia 1923 (1996), edited by Stephen Skinner, p. 21

Comment author: Yvain 15 September 2011 05:24:29PM · 9 points

If one is skeptical of the existence of Thelema or of the validity of these spiritual experiences, then this sounds a lot like religious leaders who say "Sure, believe in Heaven. But don't commit suicide to get there faster. Or commit homicide to get other people there faster. Or do anything else that contradicts ordinary decency."

Part of the fun of being right is that when your system contradicts ordinary decency, you get to at least consider siding with your system.

(although hopefully if your system is right you will choose not to, for the right reasons.)

Comment author: Nornagest 15 September 2011 10:32:45PM · 6 points

My Crowley background is pretty spotty, but I read that as him generalizing over ethical intersections with religious experience and then specializing to his own faith. It's not entirely unlike some posts I've read here, in fact. The implication seems to be that if some consequence of your religious ethics (i.e. axiomatic ethics; we could substitute decision-theoretic or similarly fundamental ethics) seems to suggest gross violations of common ethics, then it's more likely that you've got the wrong axioms, or forgot to carry the one somewhere, than that you need to run out and (e.g.) destroy all humans. Which is very much what I'd expect from a rationalist analysis of the topic.

Comment author: Dr_Manhattan 15 September 2011 06:06:33PM · 5 points

Here is an intuition pump: you see a baby who has gotten hold of his dad's suitcase nuke and is about to destroy the city. Do you prevent him from pushing the button, even by lethal means? If the answer is yes, then consider Richard's original question, and check whether the differences between the two situations are enough to reverse your decision.

Comment author: khafra 15 September 2011 06:10:37PM · 1 point

On the one hand, yes; on the other hand, I do think I take the risks from UFAI seriously, and I have some relevant experience and skill, but I still wouldn't participate in a paramilitary operation against a GAI researcher.

edit: On reflection, this is due to my lack of confidence in my ability to correctly predict the end of the world, and the problem of multiplying low probabilities by large utilities.

Comment author: Dr_Manhattan 16 September 2011 08:47:01PM · 1 point

On reflection, this is due to my confidence in my ability to correctly predict the end of the world, and the problem of multiplying low probabilities by large utilities.

You mean lack of confidence, right?

Comment author: Dr_Manhattan 15 September 2011 05:17:02PM · 4 points

unhesitatingly stick to the course which ordinary decency indicates

Extraordinary situations call for extraordinary decency